
ETL Developer Resume


SUMMARY

  • Over 6 years of results-driven, diversified experience in the analysis, design, development, integration and maintenance of data warehouse applications for various clients. Seeking an opportunity to apply business analysis skills and technical expertise in design and development using Teradata and Informatica.
  • Proficiency in building BTEQ scripts in Teradata for transforming data from source to target database
  • Experience in loading data using the FastLoad, MultiLoad, TPT and FastExport Teradata utilities (a minimal FastLoad sketch follows this list)
  • Strong understanding of Teradata architecture, components, utilities and data access methods
  • Expertise in Informatica Power Center 8.x/7.x for ETL (Extract, Transform and Load) and business intelligence systems involving large data volumes
  • Extensive experience with Informatica tools: Source Analyzer, Warehouse Designer, Transformation Developer, Mapping/Mapplet Designer, Workflow Manager and Workflow Monitor
  • Expertise in creating and scheduling sessions and workflows using Informatica Server Manager, and in scheduling BTEQ scripts in Teradata through the AppWorx scheduling tool.
  • Experience in developing ETL processes using major RDBMS like Teradata and Oracle
  • Experience in extracting data from various sources including flat files, Excel spreadsheets and RDBMS tables
  • Strong knowledge in Data warehousing architectures, RDBMS concepts, Relational Modeling and Dimensional Modeling
  • Experience in designing and maintaining slowly changing Dimensions
  • Experience in supporting business users' reporting needs using Teradata and Informatica
  • Experience working with Business Analysts, Business Managers and Users
  • Good understanding of Software Development Life Cycle (SDLC)
  • Sound understanding of ITIL Processes - Incident, Problem and Change Management Processes
  • Experience working with ticketing tools such as BMC Remedy and HP Service Desk
  • Experience in documentation using Microsoft Office Tools
  • Ability to learn and apply new technologies
  • Good communication, analytical and problem-solving skills
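
A minimal FastLoad sketch of the kind of bulk load described above; the database, table, column and file names are invented placeholders, not details of any actual engagement:

    LOGON tdpid/etl_user,password;            /* placeholder credentials */
    DROP TABLE stg_db.customer_err1;          /* clear old error tables  */
    DROP TABLE stg_db.customer_err2;
    SET RECORD VARTEXT "|";                   /* pipe-delimited input    */
    DEFINE cust_id   (VARCHAR(20)),
           cust_name (VARCHAR(100)),
           cust_city (VARCHAR(50))
    FILE = /data/inbound/customer.dat;        /* hypothetical source file */
    BEGIN LOADING stg_db.customer_stg
          ERRORFILES stg_db.customer_err1, stg_db.customer_err2;
    INSERT INTO stg_db.customer_stg
    VALUES (:cust_id, :cust_name, :cust_city);
    END LOADING;
    LOGOFF;

FastLoad requires an empty target table, which is why it suits staging loads; MultiLoad or TPT handle targets that already contain data.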

TECHNICAL SKILLS

Data Warehousing Technologies: Teradata TD14, TD13 and TD12

ETL Tools: Informatica Power Center 8.x, 7.x and 6.x (Designer, Workflow Manager and Workflow Monitor)

RDBMS: Teradata TD13 and TD12; Oracle 11g, 9i and 8i; MS SQL Server

DB Tools: SQL*Plus, Oracle SQL Developer, TOAD 8.0

Operating Systems: Sun Solaris, IBM AIX, RedHat Linux, SUSE Linux, MS Windows Server 2000/2003/2008, Windows XP and 7

Programming and Scripting: SQL, PL/SQL, Shell Scripting

SDLC Methodologies: Agile model, Waterfall model

Frameworks: ITIL

PROFESSIONAL EXPERIENCE

Confidential

ETL Developer

Responsibilities:

  • Understood the existing business model and customer requirements and identified the source data to be moved to the data warehouse.
  • Extracted, transformed and loaded data from flat files and RDBMS tables into the data warehouse.
  • Prepared detailed design documentation for the ETL process using standard templates.
  • Prepared Business Requirement Document (BRD).
  • Developed Informatica mappings in the Mapping Designer and Teradata BTEQ scripts to load data from source to staging, from staging to the working database, and from the working database to the target data warehouse (TDW).
  • Designed, developed and tested mappings and mapplets using Informatica Power Center 8.6/7.1.
  • Created complex BTEQ scripts using functions such as CASE, COALESCE, CAST, NULLIFZERO, TRIM, LOWER, UPPER, LOG and SQRT, along with GROUP BY and joins including INNER JOIN, LEFT OUTER JOIN and FULL OUTER JOIN (a representative BTEQ snippet follows this list).
  • Executed sessions and sequential and concurrent batches to run mappings and BTEQ scripts using the AppWorx scheduling tool.
  • Extensively involved in parameterization of all the workflow objects.
  • Used FastLoad and MultiLoad for loading bulk data.
  • Performed root cause analysis on Informatica loads and Teradata BTEQ scripts using error logs.
  • Worked on Informatica tools - Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, and Transformations.
  • Created sessions and workflows to run the mappings and set the session parameters to improve the load performance using Informatica Server Manager.
  • Validated the output against the specifications.
  • Involved in creating Change Requests as a part of Change and Release management activities and participated in CAB (Change Advisory Board) meetings.
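
A representative fragment of such a BTEQ script, with all object and column names invented for illustration:

    .LOGON tdpid/etl_user,password;
    INSERT INTO wrk_db.sales_wrk
    SELECT s.sale_id,
           COALESCE(TRIM(UPPER(c.cust_name)), 'UNKNOWN') AS cust_name,
           CASE WHEN s.channel_cd = 'W' THEN 'WEB'
                WHEN s.channel_cd = 'S' THEN 'STORE'
                ELSE 'OTHER'
           END AS channel,
           s.sale_amt / NULLIFZERO(s.qty) AS unit_price
    FROM   stg_db.sales_stg s
    LEFT OUTER JOIN stg_db.customer_stg c
           ON s.cust_id = c.cust_id;
    /* surface a nonzero return code to the scheduler on failure */
    .IF ERRORCODE <> 0 THEN .QUIT 1;
    .LOGOFF;
    .QUIT 0;

NULLIFZERO guards the division against zero quantities, and the nonzero exit code lets a scheduler such as AppWorx mark the step as failed.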

Confidential

ETL Developer

Environment: Informatica Power Center 7.1, Oracle 10g/9i, SQL Server 2005, SQL Query Analyzer, Batch Scripts, IBM AIX.

Responsibilities:

  • Extracted data from various sources and applied business logic to load it into the data warehouse.
  • Worked on Informatica 7.1 client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
  • Involved in design and development of complex ETL mappings.
  • Extensively used Router, Aggregator, Lookup, Source Qualifier, Joiner, Expression and Sequence Generator transformations to extract data.
  • Used Informatica Designer to create complex mappings, transformations, source and target tables.
  • Used Workflow Manager for creating, validating, testing and running the sequential and concurrent sessions.
  • Worked extensively with various databases, including Teradata, Oracle 9i/10g and SQL Server.
  • Supported production jobs, data enhancements and code fixes.
  • Involved in performance tuning of targets, sources, mappings, and sessions.
  • Developed and managed shell scripts extensively for job execution and automation (see the sketch after this list).
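
A minimal sketch of such a wrapper script; the workflow, service, folder and recipient names are placeholders, and pmcmd flag sets vary across Power Center versions (this shows the later, domain-based style):

    #!/bin/sh
    # Start a PowerCenter workflow, capture its log, and alert on failure.
    WF_NAME=wf_load_sales
    LOG_FILE=/var/log/etl/${WF_NAME}_`date +%Y%m%d`.log

    pmcmd startworkflow -sv INT_SVC -d DOM_ETL \
          -u "$PM_USER" -p "$PM_PASS" \
          -f FLDR_SALES -wait "$WF_NAME" >> "$LOG_FILE" 2>&1

    if [ $? -ne 0 ]; then
        echo "$WF_NAME failed; see $LOG_FILE" | mailx -s "ETL failure: $WF_NAME" oncall@example.com
        exit 1
    fi
    echo "`date`: $WF_NAME completed" >> "$LOG_FILE"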

Confidential

ETL Developer

Environment: Informatica Power Center 7.1 (Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Workflow Monitor), Oracle 9i, SQL*loader, TOAD, Windows 2000, Unix (Solaris), Unix Shell programming.

Responsibilities:

  • Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.
  • Prepared detailed design documentation for the ETL process using standard templates.
  • Used Informatica to populate the data mart from large flat files, Excel sheets and relational databases.
  • Worked with Power Center Designer tools to develop mappings and mapplets that extract and load data from flat files and Excel sheets to the target.
  • Created and tested data model mappings from source systems to target systems. Implemented Joiner, Expression, Aggregator, Rank, Lookup, Update Strategy, Filter and Router transformations in mappings.
  • Worked on Designer tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Created reusable transformations and mapplets for use in mappings.
  • Used target load order to load the data into the target warehouse.
  • Used SQL*Loader for loading bulk data.
  • Created and monitored sessions and batches to run the mappings and set the session parameters to improve the load performance.
  • Scheduled and ran extraction and load processes and monitored sessions using Informatica Workflow Manager.
  • Used slowly changing dimensions to update the warehouse per requirements (an illustrative SQL sketch follows this list).
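
An illustrative sketch of the Type 2 slowly-changing-dimension pattern referenced above, in Oracle SQL with invented table and column names; changed attributes close out the current row before a new version is inserted:

    -- Step 1: expire current rows whose tracked attributes changed.
    UPDATE dim_customer d
    SET    d.eff_end_dt  = TRUNC(SYSDATE) - 1,
           d.current_flg = 'N'
    WHERE  d.current_flg = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.cust_id = d.cust_id
                   AND   (s.cust_city    <> d.cust_city
                       OR s.cust_segment <> d.cust_segment));

    -- Step 2: insert a fresh current row for new and changed customers,
    -- both of which now lack a current_flg = 'Y' row.
    INSERT INTO dim_customer (cust_key, cust_id, cust_city, cust_segment,
                              eff_start_dt, eff_end_dt, current_flg)
    SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.cust_city, s.cust_segment,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.cust_id = s.cust_id
                       AND    d.current_flg = 'Y');

A Type 1 attribute would instead be overwritten in place on the current row.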

Confidential

ETL Developer

Environment: Informatica Power Center 6.2 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Oracle 8i and Windows NT.

Responsibilities:

  • Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Worked on ETL mapping design using Informatica.
  • Used heterogeneous data sources such as flat files and Oracle.
  • Developed various mappings combining sources, targets and transformations using the Designer.
  • Implemented Joiner, Expression, Aggregator, Rank, Lookup, Update Strategy, Filter and Router transformations in mappings (a sample Update Strategy expression follows this list).
  • Monitored workflows using workflow monitor in Informatica.
  • Used target load order and event-based loading.
  • Transformed, validated and loaded data into the data warehouse using Power Center.
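
As a small illustration of how such a mapping routes rows, a hypothetical Update Strategy expression; the Lookup output port lkp_cust_key is invented:

    -- Rows whose key the Lookup did not find are new: flag for insert;
    -- everything else updates the existing row.
    IIF(ISNULL(lkp_cust_key), DD_INSERT, DD_UPDATE)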
