
ETL Developer Resume Profile

PROFESSIONAL SUMMARY:

  • Over 7 years of experience in Information Technology, including Data Warehouse/Data Mart development using ETL tools: Informatica Power Center and IBM InfoSphere DataStage 7.5/8.1 (DataStage, QualityStage, Server and Parallel EE editions).
  • Worked on various domains including Insurance, Finance, Banking, Pharmaceutical and Retail.
  • Good exposure to the full SDLC, including requirement gathering, development, testing, debugging, deployment, documentation, and production support.
  • Worked with various heterogeneous source systems such as Oracle, Teradata, MS SQL Server, flat files, and legacy systems.
  • Expert in data warehousing concepts, including the Ralph Kimball and Bill Inmon methodologies.
  • Practical understanding of dimensional and relational data modeling concepts such as star-schema modeling, snowflake-schema modeling, and fact and dimension tables.
  • Good understanding of views, synonyms, indexes, joins, and sub-queries. Extensively used cursors and ref cursors.
  • Expertise in creating detailed design documents and performing proofs of concept (POCs).
  • Extensively created mapplets, common functions, reusable transformations, and lookups for better reusability.
  • Expertise in building, enhancing, and migrating Business Objects universes and in retrieving data through universes and personal data files.
  • Sound experience in performance tuning of ETL processes; reduced execution time for huge data volumes on company merger projects.
  • Extensive experience in tuning and scaling procedures for better performance by running explain plans and using approaches such as hints and bulk loads.
  • Experienced with Teradata utilities: FastLoad, MultiLoad, BTEQ scripting, FastExport, and SQL Assistant.
  • Extensively used SQL and PL/SQL to write stored procedures, functions, packages, and triggers.
  • Expertise in exception-handling strategies to capture errors and referential integrity violations during load processes and to notify the source team of exception records.
  • Experience in UNIX shell scripting, job scheduling, and communicating with the Informatica server using pmcmd.
  • Delivered all the projects/assignments within specified timelines.
  • Communicate effectively with business users, project managers, and team members.
  • Strong ability to work within demanding and aggressive project schedules and environments.
  • Excellent analytical and problem-solving skills; a motivated, self-starting team player with strong interpersonal and communication skills.

TECHNICAL SKILLS:

ETL Tools

Informatica Power Center 9.x/8.x/7.x/6.x, IBM InfoSphere DataStage 9.x/8.x/7.x

Databases

Oracle 10g/9i/8i/7i, IBM DB2 UDB, Sybase SQL Server 11.0, MS SQL Server 6.5/7.0/9.0, MS Access 2000, Teradata

Programming Languages

C, C++, SQL, PL/SQL, UNIX shell scripting, XML

BI Tools

Business Objects 6/5, MicroStrategy 9

Web Technologies

JavaScript 1.2, HTML 4.0

Others

Erwin 4.1.2/3.5.2, TOAD, SQL*Loader, MS Office, SmartFTP, UltraEdit, Autosys, Unicenter, Control-M, Quality Center, MS Visio

Operating Systems

Sun Solaris, Windows NT 4.0, Windows 95/98/2000/XP, HP-UX, MS-DOS 6.22, IBM PC compatibles

PROFESSIONAL EXPERIENCE:

Confidential

Sr. ETL Developer

Responsibilities:

  • Responsible for the design and development of multiple Brokerage Data Warehouse projects leveraging the Informatica Power Center ETL tool, Oracle and DB2 databases, and Business Objects reporting tools.
  • Actively participated as part of a team in the logical and physical design of the data warehouse.
  • Worked closely with the data architect to resolve data issues.
  • Developed Informatica mappings using various transformations, sessions, and workflows. Teradata was the target database; the sources were a combination of flat files, Oracle tables, Excel files, and Teradata tables.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and unit tested the mappings.
  • Loaded data into Teradata target tables using the Teradata utilities FastLoad, MultiLoad, and FastExport. Queried the target database using Teradata SQL and BTEQ for validation.
  • Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
  • Involved in error handling, debugging, and troubleshooting sessions using the session logs, Debugger, and Workflow Monitor.
  • Assisted in developing different kinds of grid reports using MicroStrategy.
  • Extensively used pre-SQL and post-SQL scripts for loading data into the targets according to the requirements.
  • Worked with the Informatica Administrator in migrating the mappings, sessions, source/target definitions from the development repository to the production environment.
  • Designed and developed jobs using IBM DataStage to extract data from the Oracle database, transform it per business requirements, and populate flat files.
  • Worked with DataStage Director for testing and monitoring the executable jobs.
  • Involved in performance tuning of the Informatica sessions and workflows.
  • Created reusable transformations for better performance.
  • Created and reviewed the design and code review templates.
  • As part of the testing team, conducted unit tests and system tests.
  • Scheduled jobs using Unicenter to automate the Informatica sessions.
  • Developed control files and stored procedures to manipulate and load data into the Oracle database.
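The BTEQ validation step described above can be sketched as a short script; the TDPID, credentials, and table names here are hypothetical placeholders, not actual project values:

```
.LOGON tdprod/etl_user,password;

/* Row-count check against the freshly loaded target table */
SELECT COUNT(*) FROM edw.dim_account;

/* Abort with a non-zero return code if the query failed,
   so the calling shell script can detect the failure */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

A wrapper shell script would typically test BTEQ's exit status and raise an alert on any non-zero value.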

Environment: Informatica Power Center 9.1, IBM InfoSphere DataStage 7.5/8.1 (DataStage, QualityStage, Server and Parallel EE), Oracle 9i, flat files, MS SQL Server 2005/2000, SSIS, Erwin 4.1.2, Cognos, WinSCP, Control-M, MS Visio, Harvest, Mercury Quality Center, Shell Script, UNIX.

Confidential

Sr. ETL Developer

Responsibilities:

  • Responsible for the design and development of the Sales Data Warehouse project leveraging the Informatica Power Center ETL tool, Teradata database, and MicroStrategy reporting tool.
  • Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
  • Responsible for Data Warehouse Architecture, ETL and coding standards.
  • Used Informatica as ETL tool, and stored procedures to pull data from source systems/ files, cleanse, transform and load data into databases.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Conducted a series of discussions with team members to convert business rules into Informatica mappings.
  • Extracted Erwin physical models into the repository manager.
  • Involved in testing universes and reports for correct mapping of the objects and data correctness.
  • Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Created Mapplets and used them in different Mappings.
  • Developed a stored procedure to compare source data with warehouse data; records not present were written to a spool table, which was then used as a lookup in a transformation.
  • Performed extensive bulk loading into the target using Oracle SQL*Loader.
  • Involved in error handling, debugging, and troubleshooting sessions using the session logs, Debugger, and Workflow Monitor.
  • Monitored data quality and generated weekly/monthly/yearly statistics reports on production process success/failure rates for causal analysis as part of maintenance; enhanced existing production ETL scripts.
  • Managed scheduling of tasks to run at any time without operator intervention.
  • Leveraged workflow manager for session management, database connection management and scheduling of jobs.
  • Generated list reports, cross-tab reports, and drill-through reports querying the database with Cognos Impromptu.
  • Created UNIX shell scripts for the Informatica ETL tool to automate sessions and cleanse the source data.
  • Implemented pipeline partitioning techniques such as hash-key, round-robin, key-range, and pass-through in mapping transformations.
  • Experienced in debugging and performance tuning of targets, sources, mappings, and sessions; optimized mappings and implemented complex business rules by creating reusable transformations and mapplets.
  • Delivered all the projects/assignments within specified timelines.
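For the kind of SQL*Loader bulk load described above, a minimal control file might look like the following; the table, file, and column names are illustrative, not taken from the project:

```
LOAD DATA
INFILE 'sales_feed.dat'
BADFILE 'sales_feed.bad'
APPEND
INTO TABLE stg_sales
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( sale_id,
  sale_dt  DATE "YYYY-MM-DD",
  region,
  amount   DECIMAL EXTERNAL
)
```

It would be invoked along the lines of `sqlldr userid=etl_user/password control=sales_feed.ctl direct=true`, where `direct=true` requests the direct-path (bulk) load.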

Environment: Informatica Power Center 8.6/7.1.2, IBM InfoSphere DataStage 7.5/8.1 (DataStage, QualityStage, Server and Parallel EE), Oracle 10g, SQL Server 2008, SSIS, Teradata, MS Access 2000, Cognos, Aqua Data Studio, Erwin 4.1.2, TOAD, WinSCP, Autosys, Rational ClearCase, Rational RequisitePro, Rational ClearQuest, UNIX.

Confidential

ETL Developer

Responsibilities:

  • Responsible for the design and development of multiple Brokerage Data Warehouse projects leveraging the Informatica Power Center ETL tool, Oracle and DB2 databases, and Business Objects reporting tools.
  • Actively participated as part of a team in the logical and physical design of the data warehouse.
  • Worked closely with the data architect to resolve data issues.
  • Developed Informatica mappings using various transformations, sessions, and workflows. Teradata was the target database; the sources were a combination of flat files, Oracle tables, Excel files, and Teradata tables.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and unit tested the mappings.
  • Loaded data into Teradata target tables using the Teradata utilities FastLoad, MultiLoad, and FastExport. Queried the target database using Teradata SQL and BTEQ for validation.
  • Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
  • Involved in error handling, debugging, and troubleshooting sessions using the session logs, Debugger, and Workflow Monitor.
  • Assisted in developing different kinds of grid reports using MicroStrategy.
  • Extensively used pre-SQL and post-SQL scripts for loading data into the targets according to the requirements.
  • Worked with the Informatica Administrator in migrating the mappings, sessions, source/target definitions from the development repository to the production environment.
  • Involved in performance tuning of the Informatica sessions and workflows.
  • Created reusable transformations for better performance.
  • Created and reviewed the design and code review templates.
  • As part of the testing team, conducted unit tests and system tests.
  • Scheduled jobs using Unicenter to automate the Informatica sessions.
  • Developed control files and stored procedures to manipulate and load data into the Oracle database.
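The FastLoad staging loads mentioned above follow a standard script shape; in this sketch the logon string, table, and file names are placeholders:

```
LOGON tdprod/etl_user,password;

SET RECORD VARTEXT ",";

BEGIN LOADING edw.stg_trades
   ERRORFILES edw.stg_trades_err1, edw.stg_trades_err2;

DEFINE trade_id (VARCHAR(18)),
       trade_dt (VARCHAR(10)),
       amount   (VARCHAR(20))
FILE = trades.dat;

INSERT INTO edw.stg_trades VALUES (:trade_id, :trade_dt, :amount);

END LOADING;
LOGOFF;
```

FastLoad requires an empty target table, which is why it is typically pointed at a staging table that downstream mappings then transform.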

Environment: Informatica Power Center 8.1, Oracle 9i, flat files, MS SQL Server 2005/2000, SSIS, Erwin 4.1.2, Cognos, WinSCP, Control-M, MS Visio, Harvest, Mercury Quality Center, Shell Script, UNIX.

Confidential

ETL Developer

Responsibilities:

  • Analyzed the existing system and prepared and presented an impact analysis document.
  • Actively participated in the design, development, and implementation of the Enterprise Data Warehouse (EDW) process and data mart.
  • Created several Informatica mappings to populate the data into dimension and fact tables.
  • Configured the Repositories.
  • Developed various mappings by using reusable transformations.
  • Executed the workflow using pmcmd command in UNIX.
  • Generated reports querying the database with Cognos Impromptu.
  • Improved mapping performance using SQL overrides.
  • Created Mapplets and used them in different mappings.
  • Used Debugger to test the data flow and fix the mappings.
  • Implemented error data validations using error-handling strategy techniques.
  • Tuned Informatica mappings and sessions for optimum performance.
  • Implemented all the ETL architecture standards for the newly adopted Teradata platform.
  • Provided post-production support to monitor jobs in the new environment.
  • Developed mappings, sessions, and workflows for the new ETL process, following quality standards.
  • Responsible for QA migration and production deployment.
  • Used a variety of transformations for data manipulation, including Source Qualifier, Aggregator, Filter, Expression, Connected and Unconnected Lookup, Update Strategy, and Normalizer.
  • Developed and implemented long-term IT goals and strategies.
  • Participated in the 24/7 support rotation.
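The pmcmd execution step noted above is typically wrapped in a small shell script along these lines; the service, domain, folder, and workflow names are hypothetical, and credentials are assumed to come from the environment:

```sh
#!/bin/sh
# Start the workflow and block until it completes (-wait).
pmcmd startworkflow -sv INT_SVC -d DOM_DEV \
      -u "$PM_USER" -p "$PM_PASS" \
      -f EDW_FOLDER -wait wf_load_edw
rc=$?

# A non-zero return code from pmcmd means the workflow failed.
if [ "$rc" -ne 0 ]; then
    echo "wf_load_edw failed with return code $rc" >&2
    exit "$rc"
fi
```

Checking pmcmd's exit status this way lets a scheduler treat a workflow failure as a job failure.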

Environment: Informatica Power Center 7.1.2, Oracle 7i, Flat files, SQL server 7.0, Analytics Server 3.1.2, Cognos Impromptu, Shell / Perl Script, Sun Solaris OS, PL/SQL, Toad 7.0, Erwin 3.5.2.

Confidential

ETL Developer

Responsibilities:

  • Developing Informatica mappings and shell scripts for loading trading and clearing data from various clients.
  • Actively participated as part of a team in the logical and physical design of the data warehouse.
  • Worked closely with the data architect to resolve data issues.
  • Developed Informatica mappings using various transformations, sessions, and workflows. SQL Server was the target database; the sources were a combination of flat files, Oracle tables, PeopleSoft, Excel files, and CSV files.
  • Involved in creating stored procedure to support recovery.
  • Responsible for working closely with the Informatica administrator to migrate source and target definitions, mappings, workflows, and flat files from the development environment to the production environment.
  • Extensively used the Lookup and Update Strategy transformations to implement slowly changing dimensions.
  • Used different tasks such as the Email and Command tasks.
  • Worked with the Informatica Administrator in migrating the mappings, sessions, source/target definitions from the development repository to the production environment.
  • Worked with the DBA on performance tuning of the Informatica sessions and workflows; created reusable transformations for better performance.
  • Involved in the mirroring of the staging environment to production.
  • Created and reviewed the design and code review templates.
  • As part of the testing team, conducted unit tests and system tests.
  • Scheduled jobs using Autosys to automate the Informatica sessions.
  • Optimized the Autosys batch flow.
  • Developed control files and stored procedures to manipulate and load data into the Oracle database.
  • Optimized queries using SQL Navigator.
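Autosys jobs like those above are defined through JIL; a minimal command-job definition might look like this sketch, where the job, script, machine, and owner names are illustrative:

```
insert_job: DW_LOAD_TRADES   job_type: c
command: /opt/etl/scripts/run_wf_load_trades.sh
machine: etlhost01
owner: etladm
start_times: "02:00"
alarm_if_fail: 1
```

The `alarm_if_fail` attribute raises an alarm when the wrapped script exits non-zero, which is how a failed Informatica workflow surfaces to operations.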

Environment: Informatica Power Center 6.12, Windows NT, Sun Solaris, Shell Scripts, Oracle 8i, PL/SQL, Pro*C, SQL*Loader, Discoverer 2000, SQL Server, Autosys, DB2, Erwin 3.5.2.
