
Sr. ETL Developer Resume


Merriam, KS


  • Sr. ETL developer with over eight (8) years of experience in analysis, design, testing, implementation, maintenance, production support and knowledge transfer
  • Experience with the complete Software Development Life Cycle (SDLC), including requirements analysis, estimation, design, construction, unit and system testing and implementation
  • Extensive experience with Informatica PowerCenter/PowerMart 9.x/8.x/7.x designing and developing complex mappings, mapplets, transformations, workflows and worklets; configured the Informatica server and scheduled workflows and sessions
  • Used Talend components tOracleInput, tOracleOutput, tAggregateRow, tOracleConnection, tFileInputDelimited, tFileOutputDelimited, tSortRow, tUniqRow and tOracleRow
  • Integrated data from various sources such as SQL Server, Oracle 10g, MySQL and flat files into one Oracle 10g database using Oracle Data Integrator and Talend 5.1.0/5.1.2/5.3.1
  • Implemented data warehousing methodologies for Extraction, Transformation and Loading using Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor and Repository Server Administration Console
  • Extensive experience in designing and developing complex mappings using varied transformation logic, including Unconnected and Connected Lookup, Normalizer, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner and Update Strategy
  • Proficient in the advanced concepts of the Ralph Kimball and Bill Inmon methodologies
  • Extensive knowledge in data modeling, data conversions, data integration and data migration with specialization in Informatica Power Center and Erwin
  • Expert level mastery in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables and legacy system files
  • Proficient with Oracle 11g/10g/9i, PL/SQL, SQL*Plus and SQL Server 2005/2008, with strong experience in database interfaces such as PL/SQL Developer, SQL*Plus and Toad
  • Significant experience writing SQL queries, Stored Procedures, Cursors, Indexes, and Views
  • Experience writing UNIX shell scripts to run sessions automatically
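Session automation of this kind is typically built around Informatica's pmcmd command-line utility. A minimal sketch follows; the Integration Service, domain and folder names are hypothetical placeholders, not values from this resume, and credentials are read from environment variables:

```shell
#!/bin/sh
# Sketch of a scheduler-friendly wrapper around Informatica's pmcmd CLI.
# Service, domain and folder names below are illustrative placeholders.
INFA_SERVICE="IS_DEV"       # hypothetical Integration Service name
INFA_DOMAIN="DOM_DEV"       # hypothetical domain name
INFA_FOLDER="ETL_FOLDER"    # hypothetical repository folder

run_workflow() {
    wf="$1"
    # -uv/-pv tell pmcmd to read the user and password from the named
    # environment variables, keeping credentials out of the script.
    cmd="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -uv INFA_USER -pv INFA_PASS -f $INFA_FOLDER -wait $wf"
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "$cmd"         # dry run: show the command instead of executing it
    else
        $cmd                # -wait makes the exit code reflect workflow success
    fi
}
```

A scheduler job (Autosys, Control-M or Tidal) would then call `run_workflow wf_daily_load` and act on the exit code.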


ETL Tools: Informatica PowerCenter 9.6.1/8.x/7.x, Talend 5.1.0/5.2.1/5.3.1, Talend Integration Suite

Data Modeling: Erwin 4.0/3.5, Star Schema Modeling, Snowflake Schema Modeling

Databases: Oracle 11g/10g/9i, SQL Server 2005/2008, Teradata, DB2

Languages: SQL, PL/SQL

DB Tools: Toad, SQL*Loader, SQL Developer

Scheduling Tools: Autosys, Control-M, Tidal

Operating Systems: UNIX, Linux.


Sr. ETL Developer

Confidential, Merriam, KS


  • Created publications, subscriptions, topics using Informatica Data Integration Hub 9.6.2
  • Created custom publications and subscriptions to create XML target files to feed Mercury Gate (transportation management system)
  • Optimized the OMS (Order Management System) by mapping the AS400 source to the new Mercury Gate solution
  • Analyzed the business requirements and coordinated with the business analysts to develop ETL procedures that conformed to enterprise standards while maintaining consistency across all applications and systems
  • Translated high-level design specs into simple ETL coding and mapping standards
  • Conducted analysis, design, development, testing and implementation of Informatica transformations and workflows for extracting data from multiple legacy systems
  • Discussed strategies for handling various concepts like Error Handling and Slowly Changing Dimensions
  • Worked on Talend components tMSSqlInput, tMSSqlOutput, tMSSqlConnection, tMSSqlRow, tOracleInput, tOracleOutput, tOracleConnection, tAggregateRow, tFileInputDelimited, tFileOutputDelimited, tSortRow and tUniqRow
  • Worked on Talend ETL to load data from various sources into a SQL Server database, using tMap, tReplicate, tFilterRow, tWaitForFile and various other Talend features
  • Used Talend reusable components such as routines, context variables and globalMap variables
  • Tuned ETL mappings, workflows and the underlying data model to optimize load and query performance
  • Troubleshot problems by checking session/error logs in the Workflow Monitor and using the debugger in the Mapping Designer to debug complex mappings
  • Performance-tuned workflows by identifying and eliminating bottlenecks in targets, sources, mappings, sessions and workflows
  • Installed and configured the Informatica PowerCenter 9.6.1/9.5.1 client in a Windows environment
  • Created mappings, mapplets, sessions, workflows and worklets in Informatica PowerCenter 9.6.1/9.5.1
  • Created generic worklets for session logging of each workflow process
  • Developed complex SQL queries, unions and multiple table joins, and worked with views
  • Hands on experience in tuning maps, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions
  • Extensively used transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence Generator and Stored Procedures
  • Created Windows batch scripts for the SFTP process and XML file handling
  • Used the Informatica debugger to solve problems in mappings and troubleshot existing ETL bugs
  • Developed migration plan documents, communicated with the concerned stakeholders and conducted impact and feasibility analysis
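The SFTP file-handling step above was scripted in Windows batch; a UNIX-shell analogue is sketched below under stated assumptions (the host name, directories and file layout are illustrative, not taken from the project):

```shell
#!/bin/sh
# Sketch: generate an sftp batch file that pushes all XML target files
# from a local directory to a remote system. Names are placeholders.
build_sftp_batch() {
    src_dir="$1"      # local directory holding the XML target files
    remote_dir="$2"   # destination directory on the SFTP server
    batch_file="$3"   # where to write the sftp command batch
    {
        echo "cd $remote_dir"
        for f in "$src_dir"/*.xml; do
            [ -e "$f" ] || continue   # no XML files: emit no put commands
            echo "put $f"
        done
        echo "bye"
    } > "$batch_file"
}

# The actual transfer would then be (hypothetical host):
#   sftp -b "$batch_file" etluser@files.example.com
```

Driving sftp from a generated batch file keeps the transfer non-interactive, so a scheduler can run it unattended and check the exit code.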

Environment: Informatica PowerCenter 9.6.1/9.5.1/9.1, Informatica Data Integration Hub 9.6.2, Talend Data Integration 6.1, Oracle 11g/10g, SQL Server 2012, AS400, Visio, UNIX, Windows 7

Sr. ETL Developer

Confidential, Gardner, KS


  • Interacted with the end users to gather business requirements and reporting needs and created the business requirement documents
  • Designed and developed ETL strategies and mappings from source systems to target systems; the strategies catered to both initial and incremental loads
  • Worked on OSS (Operational Support System)
  • Facilitated data migration, converting Ciras, Martens and Metasolv data into Tirks
  • Worked with Agile methodology to deliver the code
  • Worked on Talend components tOracleInput, tOracleOutput, tAggregateRow, tOracleConnection, tFileInputDelimited, tFileOutputDelimited, tSortRow, tUniqRow and tOracleRow
  • Used context variables in Talend to change source and target connections dynamically
  • Executed parallel processing at the source and target levels to increase performance and reduce run time in Talend
  • Deployed and scheduled Talend jobs in the Talend Administration Center and monitored their execution
  • Created separate branches within the Talend repository for development, production and deployment
  • Worked with the Talend Administration Center, Talend installation, and context and globalMap variables in Talend
  • Developed mappings/transformations/joblets and designed ETL jobs/packages using Talend Integration Suite (TIS) in Talend 5.3.1
  • Worked on Talend ETL to load data from various sources into an Oracle database, using tMap, tReplicate, tFilterRow, tWaitForFile and various other Talend features
  • Designed the architecture of Talend jobs to run in parallel, from an execution standpoint, to reduce run time
  • Used Talend joblets and commonly used Talend transformation components such as tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput and many more
  • Developed code routines in the repository using Java and called them in expression filters
  • Tuned JVM maximum-heap parameters and cursor size in Talend as part of performance tuning
  • Extracted data from multiple operational sources to load the staging area, data warehouse and data marts using SCD (Type 1/Type 2) loads
  • Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Aggregator, Joiner, Stored Procedure, Lookup (Connected and Unconnected) and Union to develop robust mappings in the Informatica Designer
  • Performed tuning at both the database and Informatica levels
  • Developed static and dynamic parameter files
  • Involved in performance tuning at source, target, mappings, sessions, and system levels
  • Developed test plans and executed them during unit testing; also supported system testing, volume testing and user testing
  • Extracted and transformed data from various sources such as flat files and COBOL files and transferred it to the target data warehouse
  • Used Informatica to load data from SQL Server databases and Excel spreadsheets into the target Oracle database
  • Performed end-to-end ETL development of the data mart; responsible for data quality analysis to determine the cleansing requirements and designed the ETL process to perform the data loads
  • Used various tasks such as Session, Event-Wait, Decision, Email and Command tasks
  • Designed the ETL flow diagrams in Visio to arrive at the schedule for Tidal jobs
  • Developed complex stored procedures, packages, triggers and cursors and wrote shell scripts to automate the Informatica process
  • Worked on PL/SQL code optimization techniques and error-handling mechanisms
  • Involved in unit testing and system testing of the ETL jobs and worked with QA to resolve defects
  • Documented all mappings and workflows precisely
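One bullet above mentions static and dynamic parameter files; a minimal Informatica parameter file might look like the following sketch (the folder, workflow, session and connection names and the date are made-up placeholders):

```ini
[ETL_FOLDER.WF:wf_daily_load.ST:s_m_load_orders]
$$LastExtractDate=2014-01-01
$DBConnection_Src=Oracle_Src_Conn
$DBConnection_Tgt=Oracle_Tgt_Conn
```

The bracketed header scopes the values to one session within one workflow; `$$`-prefixed names are mapping parameters or variables, while `$`-prefixed names are session parameters such as relational connections. A "dynamic" parameter file is then one regenerated by a pre-session script that rewrites values such as `$$LastExtractDate` before each run.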

Environment: Informatica PowerCenter 9.1/8.6.1, Talend Data Integration 5.1.0/5.1.2/5.3.1, Oracle 11g/10g, Visio, UNIX, Windows 7 and Tidal
