
Etl Architect/sr Developer Resume


Philadelphia, PA

SUMMARY

  • 9+ years of experience in Information Technology, including Data Warehouse/Data Mart development using Talend, DataStage and Tableau.
  • Experience in data management, data architecture and database analyst roles, or similar positions.
  • 5 years of experience using the Talend Data Integration tool on BI data analytics, reporting and dashboard projects; 3+ years of experience with Talend Open Studio, Talend Cloud Hybrid Studio and the Talend Enterprise platform for Data Management (v6.1, 6.3, 6.5) and Data Integration.
  • Good exposure in overall SDLC including requirement gathering, development, testing, debugging, deployment, documentation, and production support.
  • Solid knowledge of relational database management systems, data warehousing basics and dimensional modeling (snowflake schema and star schema).
  • Experience in using Talend Admin Console (TAC) in deploying, scheduling, and Monitoring of the jobs.
  • Designed & published the Tableau dashboards.
  • Developed data pipelines and Tableau data sources using Spark SQL and Hive.
  • Extensively wrote ad-hoc Hive queries involving a large number of data points (a representative query appears at the end of this summary).
  • Performed business/subject-area analysis of the data and provided insights to the client.
  • Worked in an Agile environment with Continuous Integration using Bamboo, and used GIT as version control to manage code commits.
  • Good Experience in Talend DI Administration, Talend Data Quality and Talend Data Mapping.
  • Experience in Talend Big Data Integration for business needs involving Hadoop and NoSQL databases.
  • Conducted detailed research of higher education processes.
  • Analyzed software issues and developed software applications.
  • Provided assistance on advancement systems built on Banner products.
  • Gathered information, functional requirements, and input requirements.
  • Formulated and documented requirements for data, logical processes, and operating systems.
  • Prepared technical specifications and evaluated software products.
  • Supported, programmed, and customized GUI, printed outputs and interfaces.
  • Worked with datasets from Finance, Health Care, Pharmaceutical, Manufacturing and Service Industries.
  • Created complex mappings in Talend using components like tMap, tJoin, tReplicate, tParallelize, tAggregateRow, tDie, tUnique, tFlowToIterate, tSort, tFilterRow, etc.
  • Experience in storing and analyzing big data with Hive, MongoDB and Spark SQL to quickly load, extract, transform and process large and diverse data sets.
  • Able to work under tight deadlines and rapidly changing priorities, with a proactive, creative and focused approach to business needs, strong analytical skills and team-playing skills.
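
A minimal sketch of the kind of ad-hoc Hive/Spark SQL query referenced above; the analytics.web_events table, its columns and the date range are hypothetical placeholders, not taken from any actual project.

```sql
-- Hypothetical ad-hoc Hive query: event counts and unique users per channel
-- over a one-month window, relying on partition pruning on event_dt.
SELECT channel,
       COUNT(*)                AS event_cnt,
       COUNT(DISTINCT user_id) AS unique_users
FROM   analytics.web_events
WHERE  event_dt BETWEEN '2019-01-01' AND '2019-01-31'
GROUP  BY channel
ORDER  BY event_cnt DESC;
```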

TECHNICAL SKILLS

ETL Tools: Talend 7.2.1/7.1/7.0/6.3.1/6.1/5.x, DataStage

DW Tools: Erwin, ER/Studio

RDBMS: Oracle 10g/9i/8.x, Postgres, MS SQL Server, MS Access

Languages: SQL, HiveQL, Spark SQL, PL/SQL, C, C++, VB, Shell Scripting, Java and XML.

Operating Systems: Microsoft Windows, MS-DOS, UNIX and Linux

Development Tools: TOAD, SQL*Plus, SQL Developer, Zeppelin, Autosys, WinSCP, HP ALM

PROFESSIONAL EXPERIENCE

Confidential, PHILADELPHIA, PA

ETL Architect/Sr Developer

Responsibilities:

  • Worked closely with the development lead to finalize estimated/actual project completion dates for process design, development, testing and implementation
  • Worked with the support team to hand over ETL processes for UAT and production support
  • Developed process jobs per design, prepared unit test cases and test data, and performed unit testing
  • Worked with the BIS project manager to finalize estimated/actual project completion dates
  • Performed final code review before moving code to the SIT environment and fixed issues during the warranty phase
  • Extensively worked on Talend Management Console and Talend Cloud.
  • Deployed jobs to Talend Cloud and migrated them to various environments
  • Performed technical analysis and prepared the recommended data acquisition design approach
  • Participated in a team environment for the design, development, and implementation of data warehousing projects
  • Designed the system components / ETL automation for the extract/transform or conversion of data from source systems to the target application (including historization, surrogation, data quality, error handling components)
  • Designed and developed complex ETL mappings, sessions, workflows and identify areas of optimizations
  • Estimated the level of effort associated with new ETL jobs
  • Worked extensively on Oracle, UNIX server and SQL server
  • Sound knowledge of data warehouse architecture theories
  • Worked on ETL architecture and data schemas
  • Developed ETL architecture in accordance with business requirements.
  • Provided technical support in all phases of the enterprise architecture life cycle.
  • Provided an architecture solution that is reusable, maintainable, and scalable.
  • Experienced in ETL / Data Warehouse Management
  • Developed data migration solutions in a highly demanding environment and provided hands-on guidance to other team members
  • Quickly learned business processes and how the various business units interact with data
  • Demonstrated critical thinking, analytical skills, and employ judgment to offer thoughtful, concise solutions to business client data problems.
  • Excellent working experience in Waterfall and Agile methodologies. Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints and SQL Trace in both Teradata and Oracle (a query-tuning sketch follows this list).
  • Extensively used the tMap component for lookup and joiner functions, along with tJava, tOracle, tXML, delimited-file and logging components (tLogRow, tLogCatcher), in many jobs; created and worked with over 100 components across jobs.
  • Used Talend most used components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput and many more).
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
  • Worked on various Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, tDie etc.
  • Involved in production and deployment activities, created the deployment guide for migrating code to production, and prepared production run books.
  • Collaborated with functional experts and business users to develop architectural requirements to ensure client satisfaction with solution
  • Provided production support related to ETL schedules, tasks and work with other BI/ IT team members to resolve data refresh issues
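
A minimal sketch of the Oracle EXPLAIN PLAN tuning workflow mentioned above; the customers/orders tables and the index hint are hypothetical, shown only to illustrate eliminating a full table scan.

```sql
-- Hypothetical example: capture the optimizer's plan for a join query,
-- nudging it toward an index with a hint, then inspect the plan.
EXPLAIN PLAN FOR
SELECT /*+ INDEX(o orders_cust_ix) */
       c.customer_id, SUM(o.amount) AS total_amount
FROM   customers c
JOIN   orders    o ON o.customer_id = c.customer_id
WHERE  o.order_date >= DATE '2020-01-01'
GROUP  BY c.customer_id;

-- Review the plan to confirm the full table scan on ORDERS was eliminated.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```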

Confidential, Piscataway, NJ

Sr. Talend Admin/Developer

Responsibilities:

  • Developed test transactions in adherence to specifications.
  • Analyzed and debugged program code for revision of programs.
  • Maintained detailed documentation of program development stages and coding.
  • Reviewed documents comprising of software installation and operating processes.
  • Conducted technical training session regarding program use.
  • Participated in Requirement gathering, Business Analysis, User meetings and translating user inputs into ETL mapping documents.
  • Designed and customized data models for Data warehouse supporting data from multiple sources on real time
  • Involved in building the Data Ingestion architecture and Source to Target mapping to load data into Data warehouse
  • Extensively leveraged the Talend Big Data components (tHDFSOutput, tPigMap, tHive, tHDFSConnection) for data ingestion and data curation from several heterogeneous data sources
  • Worked with Data mapping team to understand the source to target mapping rules.
  • Prepared both High level and Low level mapping documents.
  • Analyzed the requirements and framed the business logic and implemented it using Talend.
  • Involved in ETL design and documentation.
  • Developed Talend jobs from the mapping documents and loaded the data into the warehouse.
  • Involved in end-to-end Testing of Talend jobs.
  • Analyzed and performed data integration using Talend Cloud hybrid integration suite.
  • Experience in working with large data warehouses, mapping and extracting data from legacy systems, Redshift/SQL Server UDB databases.
  • Worked on the design, development and testing of Talend mappings.
  • Wrote complex SQL queries to extract data from various sources and integrated them with Talend (a representative source query appears after this list).
  • Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.
  • Created ETL job infrastructure using Talend Open Studio.
  • Worked on Talend components like tReplace, tMap, tSortRow, tFilterColumns, tFilterRow, etc.
  • Used Database components like tMSSQLInput, tOracleOutput etc.
  • Worked with various File components like tFileCopy, tFileCompare, tFileExist.
  • Developed standards for ETL framework for the ease of reusing similar logic across the board.
  • Analyzed requirements, created designs and delivered documented solutions that adhere to the prescribed Agile development methodology and tools
  • Developed mappings to extract data from different sources, such as DB2 and XML files, and load it into the Data Mart.
  • Created complex mappings by using different transformations like Filter, Router, lookups, Stored procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Mart.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Scheduled and automated ETL processes using the scheduling tools in TIC and TAC.
  • Scheduled the workflows using Shell script.
  • Created the Talend Development Standards document, which describes the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
  • Troubleshot databases, joblets, mappings, sources and targets to find bottlenecks and improve performance.
  • Rigorously involved in data cleansing and data validation to identify and handle corrupted data.
  • Migrated Talend mappings/jobs/joblets from the development environment to test and production environments.
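
A minimal sketch of the kind of complex source-extract SQL fed into a Talend input component (e.g. tMSSqlInput); the stg.claims/stg.members tables, their columns and the seven-day incremental window are hypothetical.

```sql
-- Hypothetical SQL Server extract: join and aggregate recent claim lines
-- before handing the result set to a Talend job for further transformation.
SELECT c.claim_id,
       c.claim_date,
       m.member_id,
       m.state_code,
       SUM(c.paid_amount) AS total_paid
FROM   stg.claims  c
JOIN   stg.members m ON m.member_id = c.member_id
WHERE  c.claim_date >= DATEADD(day, -7, CAST(GETDATE() AS date))
GROUP  BY c.claim_id, c.claim_date, m.member_id, m.state_code;
```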

Confidential, Edison, NJ

Talend Big data and Tableau Developer

Responsibilities:

  • Created Talend Mappings to populate the data into dimensions and fact tables.
  • Extensively worked on creating repositories in GIT and assigning roles to developers and creating same projects in TAC.
  • Broad design, development and testing experience with Talend and the Big Data Integration Suite, and knowledge of performance tuning of mappings.
  • Developed staging and data warehouse scripts and handled deployment
  • Writing specifications for ETL processes.
  • Designed & published the Tableau dashboards.
  • Developed data pipelines and Tableau data sources using Spark SQL and Hive.
  • Extensively wrote ad-hoc Hive queries involving a large number of data points
  • Performed business/subject-area analysis of the data and provided insights to the client
  • Prepared High-Level Design documents and Technical Design Document.
  • Analyzed the sources, targets and required business logic, and mapped the data points from heterogeneous sources to the target.
  • Developed optimal strategies for distributing the web log data over the cluster, and for importing and exporting the stored web log data into HDFS and Hive using Sqoop.
  • Collected and aggregated large amounts of web log data from different sources such as web servers, mobile and network devices using Apache Flume, and stored the data in HDFS for analysis.
  • Implemented Change Data Capture (CDC) technology in Talend to load deltas into the data warehouse (a delta-load sketch follows this list).
  • Responsible for developing, supporting and maintaining the ETL (Extract, Transform and Load) processes using the Talend Integration Suite.
  • Coordination with Offshore team, providing them guidance and clarifications related to reports, underlying queries
  • Performed validation checks and deployment of reports to the customer's staging environment (Business Objects).
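
A minimal sketch of the delta load that the Talend CDC setup above would drive; the dw.customer_dim target and stg.cdc_customer staging table are hypothetical names used only for illustration.

```sql
-- Hypothetical delta load: merge captured changes into the warehouse table,
-- updating existing customers and inserting new ones.
MERGE INTO dw.customer_dim tgt
USING stg.cdc_customer src
   ON (tgt.customer_id = src.customer_id)
WHEN MATCHED THEN UPDATE SET
     tgt.customer_name = src.customer_name,
     tgt.status        = src.status,
     tgt.update_ts     = CURRENT_TIMESTAMP
WHEN NOT MATCHED THEN INSERT
     (customer_id, customer_name, status, insert_ts)
     VALUES (src.customer_id, src.customer_name, src.status, CURRENT_TIMESTAMP);
```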

Environment: Talend DI (6.3.1/6.0), Talend BDE, HDFS, UNIX, Oracle, Microsoft SQL Server Management Studio, Windows XP

Confidential, Tucson, AZ

Talend Admin/Developer

Responsibilities:

  • Interacting with the clients on a regular basis to discuss day-to-day issues and matters.
  • On-Call/Production Support provided during day-time and off-hours.
  • Acted as administrator setting up development, QA, UAT and PROD environments for Talend and Postgres, and documenting the install plan at client locations.
  • Setup ETL Framework, best practices around Talend for data integration implementation.
  • Responsible for installing Talend on multiple environments, creating projects, setting up user roles, setting up job servers, configuring TAC options, adding Talend jobs, handling job failures, on-call support, scheduling, etc.
  • Excellent experience working on tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.
  • Developed jobs to expose HDFS files to Hive tables and Views depending up on the schema versions.
  • Created Hive tables and partitions and implemented incremental imports to perform ad-hoc queries on structured data (see the partitioning sketch after this list).
  • Developed jobs to move inbound files to HDFS file locations based on monthly, weekly, daily and hourly partitioning.
  • Worked extensively on the design, development and deployment of Talend jobs to extract data, filter the data and load it into the data lake.
  • Managed and reviewed Hadoop log files; hands-on with executing Linux and HDFS commands
  • Responsible for writing Talend Routines in Java.
  • Developed ODS/OLAP data model in Erwin and also created source to target mapping documents.
  • Experience working with web services using tSOAP components for sending XML requests and receiving XML response files. Expertise in reading XML files in a loop and sending them to a web service endpoint to generate output XML files; also used advanced XML mappers for parsing multiple loop elements.
  • Responsible for digging into PL/SQL code for investigating data issues.
  • Involved in the development of Talend Jobs and preparation of design documents, technical specification documents.
  • Implemented job parallelism in Talend BDE 6.0.1.
  • Experience working with Big data components for extracting and loading data into HDFS file system.
  • Production Support activities like application checkout, batch cycle monitoring and resolving User Queries.
  • Responsible for deploying code to different environments using GIT.
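
A minimal sketch of the Hive partitioning pattern described above; the lake.inbound_events table, its columns, location and partition values are hypothetical.

```sql
-- Hypothetical external table partitioned by load date and hour, matching the
-- monthly/weekly/daily/hourly file layout the ingestion jobs produce.
CREATE EXTERNAL TABLE IF NOT EXISTS lake.inbound_events (
    event_id STRING,
    payload  STRING
)
PARTITIONED BY (load_dt STRING, load_hr STRING)
STORED AS PARQUET
LOCATION '/data/lake/inbound_events';

-- Register a newly landed hourly partition after the file-move job runs.
ALTER TABLE lake.inbound_events
  ADD IF NOT EXISTS PARTITION (load_dt='2019-06-01', load_hr='09');
```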

Environment: Talend DI (5.6.2), Talend BDE, UNIX, Oracle, Postgres, Jaspersoft Reports, Windows XP

Confidential

Talend Developer

Responsibilities:

  • Involved in Extraction, Transformation and Loading of data from 10 Source Systems into our data warehouse.
  • Worked with the offshore team on day-to-day work, reviewed the tasks done by them and got status updates in the daily meetings.
  • Created Jobs for Matching and Merging to cleanse the data coming from Source system.
  • Worked with different Sources such as Oracle, SQL Server and Flat files
  • Developed DataStage Jobs and Sequences as per the business rules and loading requirements.
  • Wrote test case scenarios and unit tested the code developed in DataStage
  • Worked on the project documentation, prepared the source-to-target mapping specs with the business logic, and was also involved in data modeling
  • Worked on migrating data warehouses from existing SQL Server to Oracle database.
  • Optimized and Tuned SQL queries used in the source qualifier of certain mappings to eliminate Full Table scans.
  • Used Command line Shell to run the Datastage jobs.
  • Used debugger and breakpoints to view transformations output and debug mappings.
  • Verified the logs to confirm all the relevant jobs completed successfully and on time, and was involved in production support to resolve production issues.
  • Wrote SQL, PL/SQL, stored procedures, triggers and cursors for implementing business rules (see the PL/SQL sketch after this list).
  • Migrated the code and release documents from DEV to QA (UAT) and to Production.
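
A minimal PL/SQL sketch of the kind of business-rule procedure referenced above; the orders/order_lines tables and the discount rule are hypothetical.

```sql
-- Hypothetical business rule: orders totalling more than 100 units
-- receive a 5% discount.
CREATE OR REPLACE PROCEDURE apply_bulk_discount (p_order_id IN NUMBER) IS
    v_qty NUMBER;
BEGIN
    SELECT SUM(quantity)
    INTO   v_qty
    FROM   order_lines
    WHERE  order_id = p_order_id;

    IF v_qty > 100 THEN
        UPDATE orders
        SET    discount_pct = 5
        WHERE  order_id = p_order_id;
    END IF;

    COMMIT;
END apply_bulk_discount;
/
```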

Environment: DataStage, Oracle 10g, SQL Server, ER/Studio, TOAD, SQL Developer, Windows XP, UNIX

Confidential

Datawarehouse Developer

Responsibilities:

  • Source system data is exported as PowerExchange data maps from IBM mainframes, which maintain the legacy data; the data is staged in load-ready/staging tables and finally loaded into the given data model using PDO logic.
  • Involved in discussions with various business users and Data Analysts from the ISD team.
  • Worked in an Agile project life cycle environment.
  • Conducted source system and source table analysis to develop the necessary data
  • Assisted the Data Architect in creating the logical/physical data models.
  • Extensively worked on Teradata tables, created Teradata BTEQ procedures and used TPT Reader/Writer utilities for Bulk Loading.
  • Created several Staging, Historical and Daily Incremental and ETL maps.
  • Worked extensively on Aggregated/Summarized data.
  • Created DataStage Mappings for Error Handling/Audit Balance control flows.
  • Prototyped ETL mappings and workflows for Slowly Changing Dimensions (SCDs) and Push Down Optimization (PDO), along with the corresponding SQL code (aggregations, transformations, rollups, inserts, updates, deletes); an SCD sketch follows this list.
  • Performance tuning and troubleshooting various DataStage mappings.
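
A minimal sketch of the Type 2 Slowly Changing Dimension handling prototyped above, in Teradata-style SQL; the dw.customer_dim dimension and stg.customer_delta staging table are hypothetical.

```sql
-- Hypothetical SCD Type 2 pattern: expire the current row for each changed
-- customer, then insert the new version with an open-ended end date.
UPDATE dw.customer_dim
SET    end_dt = CURRENT_DATE, current_flag = 'N'
WHERE  customer_id IN (SELECT customer_id FROM stg.customer_delta)
AND    current_flag = 'Y';

INSERT INTO dw.customer_dim
       (customer_id, customer_name, address, start_dt, end_dt, current_flag)
SELECT customer_id, customer_name, address,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg.customer_delta;
```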

Environment: Windows 7, Linux, DataStage, Erwin, Teradata 13, Oracle 10g, Flat Files, XML Files, ESP scheduling tool, Teradata SQL Assistant.
