
ETL/Integration Developer Resume


San Diego, CA

PROFESSIONAL SUMMARY:

  • 6+ years' experience with ETL integration (Enterprise Edition) for data integration/data quality.
  • Experienced in ETL methodology for performing data migration, data profiling, extraction, transformation, and loading using Talend; designed data conversions from a large variety of source systems including Oracle 10g, DB2, Netezza, SQL Server, HANA, and non-relational sources like flat files, XML, and mainframe files.
  • Involved in code migrations from Dev. to QA and production and providing operational instructions for deployments.
  • Worked hands-on on migrating Informatica ETL processes to Talend Studio.
  • Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.
  • Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, spanning project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
  • Experienced in designing ETL processes using Informatica to load data from sources to targets through data transformations.
  • Experienced in Cloud technologies like AWS, S3, Redshift.
  • Strong understanding of the AWS product and service suite, primarily EC2, S3, Lambda, Redshift, and EMR (Hadoop), including applicable use cases, best practices, implementation, and support considerations.
  • Excellent experience in designing and developing multi-layer web-based information systems using Web Services, Java, and JSP.
  • Extensive experience in developing Stored Procedures, Functions, Views and Triggers, SQL queries using SQL Server, TSQL and Oracle PL/SQL.
  • Experienced in integrating various data sources such as Oracle (SQL, PL/SQL), Netezza, SQL Server, and MS Access into the staging area.
  • Experienced in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Experienced in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
  • Extensively worked with TAC (Talend Administrator Console) for scheduling jobs using the execution plan.
  • Extensive experience in writing UNIX shell scripts and automating ETL processes with shell scripting; also used Netezza utilities to load data and execute SQL scripts from UNIX.
  • Extensively worked with the Netezza database to implement data cleanup and performance tuning techniques.
  • Experienced in using Automation Scheduling tools like Autosys and Control-M.
  • Experience in data migration with data from different application into a single application.
  • Responsible for data migration from MySQL Server to Oracle databases.
  • Experienced in batch scripting on windows and worked extensively with slowly changing dimensions.
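In Talend these slowly changing dimensions are typically handled with tMap/tMysqlSCD; the underlying Type 2 logic can be sketched in plain Python as a minimal in-memory illustration (column names like `cust_id` and `city` are hypothetical, not from any actual project):

```python
from datetime import date

def scd_type2_upsert(dimension, incoming, today=None):
    """Apply a Type 2 change: expire the current row and insert a new
    version when a tracked attribute differs; insert brand-new keys."""
    today = today or date.today().isoformat()
    current = {r["cust_id"]: r for r in dimension if r["end_date"] is None}
    for row in incoming:
        existing = current.get(row["cust_id"])
        if existing is None:
            # New business key: first version of the dimension row.
            dimension.append({**row, "start_date": today, "end_date": None})
        elif existing["city"] != row["city"]:
            # Changed attribute: close the old version, open a new one.
            existing["end_date"] = today
            dimension.append({**row, "start_date": today, "end_date": None})
    return dimension

history = [{"cust_id": 1, "city": "Austin",
            "start_date": "2019-01-01", "end_date": None}]
scd_type2_upsert(history, [{"cust_id": 1, "city": "Dallas"}],
                 today="2020-06-01")
# history now holds two versions: Austin (expired) and Dallas (current)
```

A Type 1 change would instead overwrite the attribute in place, keeping a single row per business key.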

TECHNICAL SKILLS:

Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS

ETL Tools: Talend, TOS, TIS, Informatica Power Center 9.x/8.x (Designer, Workflow Manager, Workflow Monitor, Repository manager and Informatica Server), Ab-Initio.

Databases: Oracle 12c/11g, MS SQL Server 2012, DB2 v8.1, Netezza.

Methodologies: Data Modeling - Logical/Physical, Dimensional Modeling - Star/Snowflake

Languages/Tools: SQL, PL/SQL, UNIX shell scripting, JSP, Web Services, Eclipse

Scheduling Tools: Autosys, Control-M

Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, ICDQ.

PROFESSIONAL EXPERIENCE:

Confidential, San Diego, CA

ETL/Integration Developer

Responsibilities:

  • Analyzed and performed data integration using Talend open integration suite.
  • Worked on the design, development and testing of Talend mappings.
  • Created ETL job infrastructure using Talend Open Studio.
  • Created jobs to pass parameters from child job to parent job.
  • Monitored the daily runs, weekly runs, and ad hoc runs to load data into the target systems.
  • Designed the ETL processes using Informatica to load data from Oracle and flat files to the target Oracle Data Warehouse database.
  • Involved in design and development of data validation, load process and error control routines.
  • Worked on Talend ETL using features such as context variables and Talend components like tReplace, tMap, tSortRow, tFilterColumn, tFilterRow, tJavaRow, tConvertType, tJava, tXMLMap, tFileInputDelimited, etc.
  • Worked with various file components like tFileCopy, tFileCompare, tFileExist, tFileDelete, tFileRename, and tFileInputXML.
  • Onboarded Workato as the primary integration tool for app-to-app integration and select ETL/data-integration processes.
  • Implemented SCD methodology including Type 1 and Type 2 changes.
  • Worked on a legacy AD application within the organization using Informatica PowerCenter 9.1.
  • Worked with application owners to configure and/or make necessary custom developments to applications as needed for the integration.
  • Used TOAD to run SQL queries and validate the data in the warehouse.
  • Worked in Amazon web services (AWS) cloud environment. Migrated from on-premises to cloud (AWS) Redshift.
  • Development of copying Data from AWS S3 to Redshift using the Talend Process.
  • Scheduled jobs using TAC.
  • Experienced in migrating existing databases from on-premises to AWS Redshift using various AWS services.
  • Used Salesforce as the ticketing tool.
  • Worked in an Agile environment.
  • Involved in testing of Stored Procedures and Functions. Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Provided production support for existing Informatica and Talend workflows.
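The S3-to-Redshift load mentioned above ultimately issues a Redshift COPY statement; a minimal sketch of how such a statement can be assembled is below. The table, bucket, and IAM role names are placeholders, and running the statement would require a live Redshift connection (in the actual project this step was driven from Talend):

```python
def build_redshift_copy(table, s3_path, iam_role, delimiter="|"):
    """Assemble a Redshift COPY statement for a delimited file in S3.
    All identifiers here are illustrative placeholders."""
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        f"DELIMITER '{delimiter}' "
        "TIMEFORMAT 'auto';"
    )

stmt = build_redshift_copy(
    "staging.orders",
    "s3://example-bucket/orders/2020-06-01/",
    "arn:aws:iam::123456789012:role/RedshiftCopyRole",
)
```

COPY is preferred over row-by-row INSERTs because Redshift parallelizes the load across slices directly from S3.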

Environment: Talend Data Integration 7.0, Informatica PowerCenter 9.1, Netezza, Workato, SQL Server 2008, UNIX, XML, UNIX scripting, TOAD, AWS, Redshift, S3, Oracle 11g.

Confidential, Round Rock, TX.

Talend Developer

Responsibilities:

  • Worked with Data mapping team to understand the source to target mapping rules.
  • Involved in ETL design and documentation.
  • Created Talend Development Standards: a document describing the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
  • Analyzed and performed data integration using Talend open integration suite.
  • Worked on the design, development and testing of Talend mappings.
  • Created ETL job infrastructure using Talend Open Studio.
  • Worked on Talend ETL using features such as context variables and Talend components like tReplace, tMap, tSortRow, tFilterColumn, tFilterRow, tJavaRow, tConvertType, tJava, tXMLMap, tFileInputDelimited, etc.
  • Worked with various file components like tFileCopy, tFileCompare, tFileExist, tFileDelete, tFileRename, and tFileInputXML.
  • Worked on improving the performance of Talend jobs.
  • Created triggers for Talend jobs to run automatically on the server.
  • Worked on Exporting and Importing of Talend jobs using Talend Admin Console.
  • Created jobs to pass parameters from child job to parent job.
  • Monitored the daily runs, weekly runs, and ad hoc runs to load data into the target systems.
  • Created Talend jobs using the dynamic schema feature.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Worked on Joblets (reusable code) and Java routines in Talend.
  • Used Talend Admin Console Job Conductor to schedule ETL jobs on a daily, weekly, monthly, and yearly basis (cron trigger).
  • Involved in end - to-end testing of jobs.
  • Wrote SQL queries to take data from various sources and integrated it with Talend.
  • Scheduled the Talend jobs using AWS CloudWatch.
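The context variables used in the jobs above let one Talend job run against Dev/QA/Prod with different connection settings. The idea can be sketched outside Talend as a simple per-environment lookup (hostnames and bucket names below are hypothetical):

```python
# Per-environment settings, analogous to a Talend context group.
CONTEXTS = {
    "dev":  {"db_host": "dev-db.internal",  "s3_bucket": "etl-dev"},
    "qa":   {"db_host": "qa-db.internal",   "s3_bucket": "etl-qa"},
    "prod": {"db_host": "prod-db.internal", "s3_bucket": "etl-prod"},
}

def load_context(env):
    """Return the settings for one environment, much as a Talend job
    resolves its context when launched with a context parameter."""
    try:
        return CONTEXTS[env]
    except KeyError:
        raise ValueError(f"unknown environment: {env}")

ctx = load_context("qa")
```

Keeping connection details in contexts rather than hard-coded in components is what makes the Dev-to-QA-to-production code migrations mentioned earlier possible without editing jobs.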

Environment: Talend Data Management Platform 6.4, DataStage, Netezza, SQL Server 2008, UNIX, XML, UNIX scripting, TOAD, Oracle 10g.

Confidential

Informatica Developer

Responsibilities:

  • Designed the ETL processes using Informatica to load data from Oracle and flat files to the target Oracle Data Warehouse database.
  • Based on the requirements created Functional design documents and Technical design specification documents for ETL.
  • Created tables, views, indexes, sequences, and constraints.
  • Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
  • Loaded data into the database using SQL*Loader.
  • Involved in testing of Stored Procedures and Functions. Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Implemented SCD methodology including Type 1 and Type 2 changes.
  • Extracted data from legacy systems, Oracle, and SQL Server sources and loaded it into targets.
  • Involved in design and development of data validation, load process and error control routines.
  • Used pmcmd to run workflows and created Cron jobs to automate scheduling of sessions.
  • Involved in ETL process from development to testing and production environments.
  • Analyzed the database for performance issues and conducted detailed tuning activities for improvement.
  • Generated monthly and quarterly drugs inventory/purchase reports.
  • Coordinated database requirements with Oracle programmers and wrote reports for sales data.
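The pmcmd/cron automation above amounts to invoking `pmcmd startworkflow` from a scheduled shell entry. A hedged sketch that assembles such a command line is below; the service, domain, folder, and workflow names are placeholders, and `-pv` reads the password from an environment variable rather than inlining it:

```python
import shlex

def pmcmd_startworkflow(service, domain, user, folder, workflow):
    """Build a pmcmd startworkflow command line. The password comes
    from the PMPASSWORD environment variable via -pv, not from the
    command line itself."""
    parts = [
        "pmcmd", "startworkflow",
        "-sv", service, "-d", domain,
        "-u", user, "-pv", "PMPASSWORD",
        "-f", folder, workflow,
    ]
    return " ".join(shlex.quote(p) for p in parts)

cmd = pmcmd_startworkflow("IS_DW", "Domain_ETL", "etl_user",
                          "SALES", "wf_daily_load")
# A cron entry would then run a wrapper script containing this command,
# e.g.:  0 2 * * * /opt/scripts/run_wf.sh
```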

Environment: Informatica PowerCenter 7.1, Oracle 9, SQL Server 2005, XML, SQL, PL/SQL, UNIX shell script.

Confidential

Oracle Developer

Responsibilities:

  • Involved in creating database objects like tables, stored procedures, views, triggers, and user-defined functions for the project I was working on.
  • Analyzed client requirements and translated them into technical requirements.
  • Gathered requirements from end users and was involved in developing the logical model and implementing requirements in SQL Server 2000.
  • Performed data migration (import & export via BCP) from text files to SQL Server.
  • Responsible for creating reports based on the requirements using reporting services 2000.
  • Identified the database tables for defining the queries for the reports.
  • Worked on SQL Server queries, stored procedures, triggers, and joins.
  • Identified and defined the datasets for report generation.
  • Formatted the reports using global variables and expressions.
  • Deployed generated reports onto the report server for access through the browser.
  • Maintained data integrity by performing validation checks.
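The BCP import above is driven by a `bcp` command line; a sketch of how one can be assembled is below. The database, table, file, and server names are placeholders, and `-T` assumes a trusted (Windows-authenticated) connection:

```python
def build_bcp_import(table, data_file, server, terminator="\t"):
    """Assemble a bcp bulk-import command for a text file
    (character mode via -c, trusted connection via -T;
    all names are illustrative placeholders)."""
    return (
        f'bcp {table} in "{data_file}" '
        f'-S {server} -T -c -t "{terminator}"'
    )

cmd = build_bcp_import("SalesDB.dbo.Orders", "orders.txt", "SQLSRV01")
```

Swapping `in` for `out` gives the matching export direction, which is how the text-to-SQL-Server migration round-trips data.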

Environment: MS SQL Server 2000, Windows Server 2000, SQL Query Analyzer, MS Access 2000, Windows NT platform.
