
Senior ETL Developer Resume


Charlotte, NC

SUMMARY

  • 9+ years of experience in data warehouse & business intelligence technologies, ETL design and development, data modeling, business/data analysis, and project management.
  • Excellent working experience and sound knowledge of the Informatica and Talend ETL tools. Expertise in reusability, parameterization, workflow design, and designing and developing ETL mappings and scripts.
  • Strong proficiency in writing SQL across Oracle, Teradata, SQL Server, Netezza, and DB2.
  • 7+ years of experience in SQL optimization and performance tuning.
  • Excellent knowledge of Business Intelligence tools such as SAP BusinessObjects and MicroStrategy.
  • Solid understanding of both technical and functional data quality concepts.
  • Extensive experience with Oracle 9i/10g/11g, SQL Server 2005, DB2 UDB, Netezza, and Teradata.
  • Experience developing analytics, dashboards, ad hoc reports, published reports, and static reports with SAP BO and Xcelsius. Well versed in all phases of the SDLC.
  • Expertise in UNIX shell scripting
  • Extensive Knowledge of RDBMS concepts, PL/SQL, Stored Procedure and Normal Forms.
  • Expertise in setting up load strategies and dynamically passing parameters to mappings and workflows in Informatica, and to workflows and data flows in SAP BusinessObjects Data Services.
  • Experience developing end-to-end business solutions, including design recommendations on the technologies to be used.
  • Experience in creating reusable transformations and complex mappings
  • Experience in versioning and managing/deployment groups in Informatica environment
  • Expertise in working with Teradata and using TPT
  • Experience with the Teradata utilities FastLoad, MultiLoad, and TPump to load data
  • Proficiency in writing Teradata queries
  • Experience working with native TPT and BTEQ on UNIX/Linux
  • Well experienced in functional and technical systems analysis and design, systems architecture, presentations, process interface design, process data flow design, system impact analysis, and design documentation.
  • Experience with dimensional modeling using both star and snowflake schemas per the Ralph Kimball methodology. Well versed in extracting and loading large data volumes.
  • Proficiency in SQL in both relational and star-schema environments
  • Proven experience in project and team leadership with zero-defect delivery. Equally comfortable working independently and in a team environment.
  • Experience handling large-scale projects.
  • Excellent experience implementing slowly changing dimensions (Type I, II & III).
  • Demonstrated ability to lead projects from planning through completion in fast-paced, time-sensitive environments.
  • Excellent knowledge of planning, estimation, project coordination, and leadership in managing large-scale projects.
  • Good interpersonal communicator who effectively interacts with clients and customers
  • Decisive, energetic and focused team lead/player who takes ownership and responsibility for requirements and contributes positively to the team.
  • Thorough understanding of Software Development Life Cycle (SDLC) including requirements analysis, system analysis, design, development, documentation, training, implementation and post-implementation review.
  • Experience with CMM Level 5 project processes, methodology, and implementation.
  • Competent technical knowledge with 8+ years of hands-on experience. Strong problem solver who can design and develop solutions and assist developers with issues.

TECHNICAL SKILLS

Programming Languages: SQL, T-SQL, PL/SQL, stored procedures

DBMS: Oracle 9i/10g/11g, SQL Server 2005, Access, Teradata R12/R13/R14, DB2 UDB, Netezza

Reporting Tools: WebI, DeskI, SAP BO XI, MicroStrategy, SAP BusinessObjects

ETL Tools: Informatica 7.1/8.1/8.6/9.1/9.5, SAP BODI, SAP BODS, Talend, SAS

Design/CASE Tools: MS Project, Visio

Operating Systems: Windows 2003/XP/2000/NT, MS-DOS, UNIX

Miscellaneous: MS VSS, WebI, Universe Designer, ERwin, TOAD 7.x

PROFESSIONAL EXPERIENCE

Senior ETL Developer

Confidential, Charlotte, NC

Responsibilities:

  • Developed Linux shell scripts using the NZSQL and NZLOAD utilities to load data from flat files into the Netezza database.
  • Designed and developed Linux scripts using the NZSQL and NZLOAD utilities to migrate data from Oracle to Netezza and from Teradata to Netezza.
  • Extensively used Netezza utilities such as NZLOAD and NZSQL.
  • Wrote heavy Netezza SQL scripts to load data between Netezza tables and implemented ELT processes for SCD Type 1 and SCD Type 2 dimensions.
  • Wrote Teradata SQL, BTEQ, MultiLoad, OleLoad, FastLoad, and FastExport scripts on UNIX, and built UNIX shell scripts to run BTEQ, FastLoad, and FastExport ETL interfaces for various business users' implementations in production.
  • Provided L1, L2, and L3 support for the ETL jobs on daily, weekly, and monthly schedules for the bank's various applications.
  • Developed BTEQ, MultiLoad, FastLoad, TPump, and FastExport scripts for loading and unloading data.
  • Helped direct the migration of data from different legacy data stores to Teradata, and mapped the data from source through staging to the target databases.
  • Imported/exported large amounts of data between files and Teradata.
  • Developed the DW ETL scripts using BTEQ, Stored Procedures, Macros in Teradata
  • Development of scripts for loading the data into the base tables in EDW using FastLoad, MultiLoad and BTEQ utilities of Teradata.
  • Created numerous scripts with Teradata utilities BTEQ, MLOAD and FLOAD, TPT and TPUMP
  • Extensively and effectively worked on Informatica ETL transformations including Source Qualifier, Connected and Unconnected Lookup, Filter, Expression, Router, Union, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, and Sequence Generator, and created complex mappings.
  • Extensively worked on developing Informatica mappings, mapplets, sessions, workflows, and worklets for data loads from various sources such as Oracle, ASCII delimited flat files, EBCDIC files, XML, COBOL, DB2, and SQL Server.
  • Wrote complex SQL queries on Netezza and used them in lookup SQL overrides and Source Qualifier overrides.
  • Extracted data from various sources like Oracle, Netezza and flat files and loaded into the target Netezza database.
  • Responsible for best practices like naming conventions, Performance tuning, and Error Handling
  • Worked on writing various stored procedures using PL/SQL on Oracle database to achieve complex functionality.
  • Worked on creating various mapplets in Informatica to provide re-usability.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD
  • Worked closely with the project team to formulate and implement a flexible system design that meets functional requirements.
  • Created the Staging mappings for delta detection using the different Transformations (Expression, Lookup, Joiner, Filter, Router, Stored Procedure etc.)
  • Built free-hand SQL reports with calculation contexts and drill-down features.
  • Requirements analysis, Design, Code and Design reviews.
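
The ELT approach for SCD Type 2 dimensions described above can be sketched in Netezza SQL. This is a minimal illustration, not code from the project; all table and column names (dim_customer, stg_customer, etc.) are hypothetical. Since older Netezza releases lack MERGE, the usual pattern is two set-based statements: expire changed rows, then insert new versions.

```sql
-- Sketch of an ELT SCD Type 2 load in Netezza SQL; all names are hypothetical.

-- Step 1: expire current dimension rows whose attributes changed in staging.
UPDATE dim_customer
SET    eff_end_dt  = CURRENT_DATE - 1,
       current_flg = 'N'
WHERE  current_flg = 'Y'
  AND  EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = dim_customer.customer_id
                 AND  (s.cust_name    <> dim_customer.cust_name
                   OR  s.cust_segment <> dim_customer.cust_segment));

-- Step 2: insert a new current row for changed keys and brand-new keys.
-- After Step 1, changed keys no longer have a current row, so the
-- anti-join below picks them up along with genuinely new keys.
INSERT INTO dim_customer
      (customer_id, cust_name, cust_segment, eff_start_dt, eff_end_dt, current_flg)
SELECT s.customer_id, s.cust_name, s.cust_segment,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id
      AND d.current_flg = 'Y'
WHERE  d.customer_id IS NULL;
```

A Type 1 dimension would instead be a single in-place UPDATE of the changed attributes plus an insert of new keys, with no effective-date or current-flag columns.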

Environment: Informatica 9.5 on UNIX, Teradata R14, Netezza, Oracle 11g, PL/SQL, TOAD, UNIX, FastLoad, MultiLoad, TPump, TPT, Aginity, NZLOAD, NZSQL

Senior Data Warehouse Developer

Confidential, MN

Responsibilities:

  • Responsible for Requirement Gathering Analysis and End user Meetings
  • Worked on writing various stored procedures using PL/SQL on Oracle database to achieve complex functionality.
  • Participated in the SIT testing and supported the System Acceptance and User Acceptance testing.
  • Developed, tested and deployed full code package for EDW in SVN & Visual Source Safe.
  • Developed ETL deployment plan for the production environment & provided run books to production support team.
  • Extensively and effectively worked on Informatica ETL transformations including Source Qualifier, Connected and Unconnected Lookup, Filter, Expression, Router, Union, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, and Sequence Generator, and created complex mappings.
  • Used Teradata SQL Assistant, Teradata Administrator, PMON, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, FastExport, and TPump in UNIX/Windows environments, and ran batch processes for Teradata.
  • Extensively worked on developing Informatica mappings, mapplets, sessions, workflows, and worklets for data loads from various sources such as Oracle, ASCII delimited flat files, EBCDIC files, XML, COBOL, DB2, and SQL Server.
  • Performed performance tuning of Teradata SQL queries and Informatica mappings.
  • Wrote complex SQL queries on Netezza and used them in lookup SQL overrides and Source Qualifier overrides.
  • Extracted data from various sources like Oracle, Netezza and flat files and loaded into the target Netezza database.
  • Worked with Teradata SQL Assistant, Teradata Administrator, Teradata Viewpoint, and BTEQ scripts.
  • Created Operation Manual documents and developed and updated other technical documentations.
  • Responsible for mentoring Developers and Code Review of Mappings developed by other developers.
  • Used SAS Library Engines and Remote Library Services to store data in external data structures and on remote computer platforms
  • Extensively used various active and passive transformations like Filter Transformation, Router Transformation, Expression Transformation, Source Qualifier Transformation, Joiner Transformation, and Look up Transformation, Update Strategy Transformation, Sequence Generator Transformation, Rank Transformation, and Aggregator Transformation.
  • Responsible for best practices like naming conventions, Performance tuning, and Error Handling
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level
  • Solid expertise in using both Connected and Unconnected Lookup transformations.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
  • Worked on creating various mapplets in Informatica to provide re-usability.
  • Worked effectively with Business Analysts and Project managers for assigned tasks and deliverable timelines.
  • Involved in defining the overall strategy for design and standards by creating many checklists for smooth deployments.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD
  • Worked on extracting real time data and loading to data warehouse.
  • Scheduling and performance tuning of Informatica mappings.
  • Worked closely with the project team to formulate and implement a flexible system design that meets functional requirements.
  • Created data model diagrams using Visio and ERwin for the EDW and data marts.
  • Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
  • Created the Staging mappings for delta detection using the different Transformations (Expression, Lookup, Joiner, Filter, Router, Stored Procedure etc.)
  • Built free-hand SQL reports with calculation contexts and drill-down features.
  • Requirements analysis, Design, Code and Design reviews.
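
The delta-detection logic behind the staging mappings above can be sketched in plain SQL (the Informatica mapping expresses the equivalent with Lookup and Expression transformations). The table and column names here are illustrative only:

```sql
-- Sketch: detect inserts and updates between the latest extract and the
-- current warehouse image. All object names are hypothetical.

-- Rows present in staging but not in the target: new records to insert.
SELECT s.*
FROM   stg_account s
LEFT JOIN dw_account t
       ON t.account_id = s.account_id
WHERE  t.account_id IS NULL;

-- Matched rows whose non-key attributes differ: changed records to update.
SELECT s.*
FROM   stg_account s
JOIN   dw_account t
       ON t.account_id = s.account_id
WHERE  s.status     <> t.status
   OR  s.owner_name <> t.owner_name;
```

In the mapping, the same split is typically done with a Lookup on the target keys followed by a Router that sends NULL-lookup rows down an insert path and changed rows down an update path.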

Environment: Informatica 9.1 on UNIX, Teradata R13, Netezza, Oracle 11g, PL/SQL, TOAD, UNIX, FastLoad, MultiLoad, TPump, TPT, SAS.

Senior Systems Developer

Confidential, Gardner, KS

Responsibilities:

  • Responsible for Requirement Gathering Analysis and End user Meetings
  • Developed complex ETL mappings and worked on transformations such as Source Qualifier, Joiner, Expression, Sorter, Aggregator, Sequence Generator, Normalizer, Connected Lookup, Unconnected Lookup, Update Strategy, and Stored Procedure.
  • Implemented Slowly Changing Dimension Type 1 and Type 2 logic for inserting and updating target tables to maintain history.
  • Created stored procedures on the Netezza and Oracle databases to achieve complex ETL logic.
  • Worked on loading data into the Netezza data warehouse and on maintaining data warehouses on Oracle and DB2.
  • Worked on loading data from different sources such as Oracle, DB2, EBCDIC files (created copybook layouts for the source files), and ASCII delimited flat files into Oracle targets and flat files.
  • Worked on writing various stored procedures using PL/SQL on Oracle database to achieve complex functionality.
  • Participated in the SIT testing and supported the System Acceptance and User Acceptance testing.
  • Developed, tested, and deployed the full code package for the EDW in SVN & Visual SourceSafe and migrated it to the SIT, UAT, and PROD environments.
  • Developed ETL deployment plan for the production environment & provided run books to production support team.
  • Used SAS Library Engines and Remote Library Services to store data in external data structures and on remote computer platforms
  • Extracted data from various source systems such as Oracle and flat files per the requirements and loaded it into Teradata using FastLoad, TPump, and MultiLoad.
  • Worked with the Informatica Source Analyzer, Warehouse Designer, Mapping Designer, mapplets, and transformations.
  • Created Operation Manual documents and developed and updated other technical documentations.
  • Responsible for mentoring Developers and Code Review of Mappings developed by other developers.
  • Extensively used various active and passive transformations like Filter Transformation, Router Transformation, Expression Transformation, Source Qualifier Transformation, Joiner Transformation, and Look up Transformation, Update Strategy Transformation, Sequence Generator Transformation, Rank Transformation, and Aggregator Transformation.
  • Responsible for best practices like naming conventions, Performance tuning, and Error Handling
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level
  • Solid expertise in using both Connected and Unconnected Lookup transformations.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
  • Worked on creating various mapplets in Informatica to provide re-usability.
  • Worked effectively with Business Analysts and Project managers for assigned tasks and deliverable timelines.
  • Involved in defining the overall strategy for design and standards by creating many checklists for smooth deployments.
  • Scheduling and performance tuning of Informatica mappings.
  • Worked closely with the project team to formulate and implement a flexible system design that meets functional requirements.
  • Created data model diagrams using Visio and ERwin for the EDW and data marts.
  • Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
  • Created the Staging mappings for delta detection using the different Transformations (Expression, Lookup, Joiner, Filter, Router, Stored Procedure etc.)
  • Built free-hand SQL reports with calculation contexts and drill-down features.
  • Requirements analysis, Design, Code and Design reviews.
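
A FastLoad job of the kind referenced above typically follows the shape below. This is a sketch only: the TDPID, credentials, file path, and table names are all placeholders, and a real script would match the target table's actual layout.

```sql
/* Sketch of a Teradata FastLoad control script; all names are placeholders. */
.LOGON tdpid/etl_user,password;

/* FastLoad requires an empty target table plus two error tables. */
DROP TABLE edw_stg.customer_stg;
CREATE TABLE edw_stg.customer_stg
  (customer_id INTEGER,
   cust_name   VARCHAR(100),
   load_dt     DATE);

.SET RECORD VARTEXT "|";            /* pipe-delimited flat file */
DEFINE customer_id (VARCHAR(11)),
       cust_name   (VARCHAR(100)),
       load_dt     (VARCHAR(10))
FILE = /data/in/customer.dat;

BEGIN LOADING edw_stg.customer_stg
      ERRORFILES edw_stg.customer_e1, edw_stg.customer_e2;
INSERT INTO edw_stg.customer_stg
VALUES (:customer_id, :cust_name, :load_dt);
END LOADING;
.LOGOFF;
```

MultiLoad follows a similar structure but allows non-empty targets and UPDATE/DELETE operations, which is why it is the usual choice for applying deltas rather than initial loads.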

Environment: Informatica 9.1 on UNIX, Netezza, Oracle 11g, PL/SQL, TOAD, BTEQ, FastLoad, MultiLoad, TPump, TPT, SAP BODI, SAP BODS, SAS

ETL Developer

Confidential, Chicago, IL

Responsibilities:

  • Responsible for Requirement Gathering Analysis and End user Meetings
  • Created load scripts using the Teradata FastLoad and MultiLoad utilities.
  • Helped developers and leads with performance tuning of Informatica jobs and Teradata SQL queries by studying explain plans and suggesting solutions.
  • Worked on loading data into the Teradata data warehouse using the MultiLoad, TPump, FastExport, and FastLoad utilities.
  • Worked on writing Teradata macros and stored procedures.
  • Loaded data into the DB2 UDB data warehouse.
  • Responsible for mentoring Developers and Code Review of Mappings developed by other developers.
  • Extensively used various active and passive transformations like Filter Transformation, Router Transformation, Expression Transformation, Source Qualifier Transformation, Joiner Transformation, and Look up Transformation, Update Strategy Transformation, Sequence Generator Transformation, Rank Transformation, and Aggregator Transformation.
  • Responsible for best practices like naming conventions, Performance tuning, and Error Handling
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level
  • Wrote complex SQL queries on Netezza and used them in lookup SQL overrides and Source Qualifier overrides.
  • Extracted data from various sources like Oracle, Netezza and flat files and loaded into the target Netezza database.
  • Solid expertise in using both Connected and Unconnected Lookup transformations.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
  • Involved in defining the overall strategy for design and standards by creating many checklists for smooth deployments.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD
  • Responsible for implementing Incremental Loading mappings using Mapping Variables and Parameter Files.
  • Worked on extracting real time data and loading to data warehouse.
  • Lead a team of 10 (5 onsite, 5 offshore) delivering the best solution to the client.
  • Usage of reusability in Informatica, parameterization, workflow design and mapping design.
  • Scheduling and performance tuning of Informatica mappings.
  • Worked closely with the project team to formulate and implement a flexible system design that meets functional requirements.
  • Extensively used ETL to load data from different source systems like Teradata Objects, Oracle, Flat files, etc., into the Staging table and load the data into the Target database.
  • Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
  • Created the Staging mappings for delta detection using the different Transformations (Expression, Lookup, Joiner, Filter, Router, Stored Procedure etc.)
  • Writing BTEQ scripts for Teradata DDL & DML statements.
  • Built free-hand SQL reports with calculation contexts and drill-down features.
  • Requirements analysis, Design, Code and Design reviews.
  • Created reports using Business Objects XI R2 - Desktop Intelligence & Web Intelligence.
  • Made changes to the existing universes according to the requirements by creating new classes and objects.
  • Added new tables to the universes and made the appropriate connections.
  • Involved in project planning, estimation, design and architectural discussion with architecture team for data warehouse.
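
The incremental-load pattern above (mapping variables plus parameter files) usually pairs a parameter-file watermark with a Source Qualifier filter. The folder, workflow, session, and variable names below are illustrative only, not taken from the project:

```sql
-- Sketch: incremental extraction driven by an Informatica mapping variable.
-- The parameter file (hypothetical names) supplies the watermark:
--   [EDW_Folder.WF:wf_load_orders.ST:s_m_load_orders]
--   $$LAST_EXTRACT_TS=2014-06-30 23:59:59
--
-- Source Qualifier SQL override (Oracle source) filters on it:
SELECT order_id, order_amt, last_upd_ts
FROM   src_orders
WHERE  last_upd_ts > TO_DATE('$$LAST_EXTRACT_TS', 'YYYY-MM-DD HH24:MI:SS');
-- Inside the mapping, SETMAXVARIABLE($$LAST_EXTRACT_TS, last_upd_ts)
-- advances the watermark, so the next run resumes where this one ended.
```

Persisting the variable in the repository (or regenerating the parameter file after each run) is what makes the load restartable without re-extracting already-loaded rows.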

Environment: Informatica 8.6.1 on UNIX, Netezza, SAP BODI, SAP BODS, Oracle 11g, DB2, Teradata 12.0, BTEQ, PL/SQL, TOAD, SAP Business Objects.

DW Developer

Confidential, Dover, NH

Responsibilities:

  • Responsible for Requirement Gathering Analysis and End user Meetings
  • Responsible for Business Requirement Documents (BRDs) and for converting functional requirements into technical specifications.
  • Worked on loading data into the Oracle data warehouse and designed complex PL/SQL code.
  • Responsible for mentoring Developers and Code Review of Mappings developed by other developers.
  • Designed and developed SAP BO DI jobs and workflows to load data from Source systems to ODS and then to Data Mart.
  • Responsible for best practices like naming conventions, Performance tuning, and Error Handling
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level
  • Solid expertise in using both the lookup() and lookup_ext() functions in SAP BODS.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
  • Involved in defining the overall strategy for design and standards by creating many checklists for smooth deployments.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD
  • Responsible for implementing Incremental Loading mappings using Mapping Variables and Parameter Files.
  • Worked on extracting real time data and loading to data warehouse.
  • Scheduling and performance tuning of SAP BODS workflows and data flows.
  • Wrote PL/SQL stored procedures, functions, nested tables. Extensive hands on experience in PL/SQL.
  • Worked closely with the project team to formulate and implement a flexible system design that meets functional requirements.
  • Extensively used ETL to load data from different source systems like Teradata Objects, Oracle, Flat files, etc., into the Staging table and load the data into the Target database.
  • Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
  • Created the Staging mappings for delta detection using the different Transformations (Expression, Lookup, Joiner, Filter, Router, Stored Procedure etc.)
  • Writing BTEQ scripts for Teradata DDL & DML statements.
  • Built free-hand SQL reports with calculation contexts and drill-down features.
  • Project estimation, planning and Tracking.
  • Requirements analysis, Design, Code and Design reviews.
  • Designed and implemented star and snowflake schemas to handle 1:N and M:N relationships.
  • Used CVS for deployment, Autosys and Crontab for scheduling.
  • Used TOAD extensively for connecting to Oracle databases and running SQL scripts and PL/SQL jobs.
  • Involved in project planning, estimation, design and architectural discussion with architecture team for data warehouse.
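
A PL/SQL load procedure of the kind described above might look like the following sketch. The schema, table, and column names are invented for illustration; the body shows a Type 1 (overwrite-in-place) refresh using MERGE:

```sql
-- Sketch of an Oracle PL/SQL procedure performing a Type 1 dimension refresh.
-- All names are hypothetical.
CREATE OR REPLACE PROCEDURE refresh_dim_product AS
BEGIN
  -- Overwrite changed attributes in place (Type 1 semantics),
  -- inserting rows for keys not yet present in the dimension.
  MERGE INTO dim_product d
  USING stg_product s
     ON (d.product_id = s.product_id)
  WHEN MATCHED THEN
    UPDATE SET d.product_name = s.product_name,
               d.category     = s.category
  WHEN NOT MATCHED THEN
    INSERT (product_id, product_name, category)
    VALUES (s.product_id, s.product_name, s.category);

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;  -- surface the failure to the calling ETL job
END refresh_dim_product;
/
```

Raising the exception after rollback lets the scheduler (Autosys/cron in this environment) see a failed job rather than silently committing a partial load.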

Environment: SAP BODS R3 on UNIX, Oracle 10g, Teradata 12.0, BTEQ and PL/SQL, TOAD.

DW Developer

Confidential, San Jose CA

Responsibilities:

  • Informatica: extracted data from the Oracle DB and transformed and loaded it into the Teradata & Oracle DBs. Excellent knowledge of workflows, sessions, tasks, transformations, mappings, file lists, and performance tuning.
  • Lead a team of 10 (3 onsite, 7 offshore) delivering the best solution to the client.
  • Usage of reusability in Informatica, parameterization, workflow design and mapping design.
  • Scheduling and performance tuning of Informatica mappings.
  • Prepared mapping documents and technical specifications
  • Writing PL/SQL stored procedures, functions, nested tables. Extensive hands on experience in PL/SQL.
  • Worked closely with the project team to formulate and implement a flexible system design that meets functional requirements.
  • Extensively used ETL to load data from different source systems like Teradata Objects, Oracle, Flat files, etc., into the Staging table and load the data into the Target database.
  • Implemented slowly changing dimensions according to the requirements, Partitioned sessions, modified cache/buffers and tuned transformations for better performance.
  • Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
  • Created the Staging mappings for delta detection using the different Transformations (Expression, Lookup, Joiner, Filter, Router, Stored Procedure etc.)
  • Writing BTEQ scripts for Teradata DDL & DML statements.
  • Built free-hand SQL reports with calculation contexts and drill-down features.
  • Built universes with aggregate navigation and derived tables.
  • Built scheduled and ad hoc WebI reports.
  • Created dashboards using Xcelsius.
  • Project estimation, planning and Tracking.
  • Requirements analysis, Design, Code and Design reviews.
  • Designed and implemented star and snowflake schemas to handle 1:N and M:N relationships.
  • Scheduling and broadcasting of reports.
  • Knowledge of various DW industry tools, enabling the best solution to be suggested at the right time.
  • Wrote PL/SQL functions and procedures, used cursors and complex queries, and guided the team. Excellent communication skills.
  • Used PVCS (Polytron Version Control System) extensively for configuration management, storing versions of SQL scripts, Informatica code, DollarU jobs, etc.
  • Used Quality Center for tracking defects and for maintaining and running test cases.
  • Extensively used Kintana tool for creating the packages and migrating the code.
  • Used TOAD extensively for connecting to Oracle databases and running SQL scripts and PL/SQL jobs.
  • Involved in project planning, estimation, design and architectural discussion with architecture team for data warehouse.
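
The BTEQ scripting mentioned above (Teradata DDL plus DML with error handling) can be sketched as follows. Logon details, database, and table names are placeholders, not actual project objects:

```sql
/* Sketch of a Teradata BTEQ script; all names are placeholders. */
.LOGON tdpid/etl_user,password;

/* DDL: rebuild a work table. */
DROP TABLE edw_wrk.daily_sales;
CREATE TABLE edw_wrk.daily_sales
  (sale_dt   DATE,
   store_id  INTEGER,
   sales_amt DECIMAL(18,2))
PRIMARY INDEX (store_id);

/* DML: populate it from the staging layer. */
INSERT INTO edw_wrk.daily_sales
SELECT sale_dt, store_id, SUM(sales_amt)
FROM   edw_stg.sales_txn
GROUP  BY sale_dt, store_id;

/* Exit with a non-zero return code if the insert failed,
   so the calling shell script / scheduler sees the failure. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

Wrapping such a script in a UNIX shell driver that checks BTEQ's exit status is the usual way these jobs were chained into daily batch schedules.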

Environment: Informatica 8.6 on UNIX, Business Objects XI R2 on Windows XP (DeskI, WebI reports), Oracle 10g, Teradata, BTEQ and PL/SQL, Xcelsius, TOAD.

DW Developer

Confidential, Rochester NY

Responsibilities:

  • Lead a team of 8 (4 onsite, 4 offshore) at client location in Rochester, NY
  • Involved in discussion and participation in designing a new EDW data warehouse with architects.
  • Defined various facts and dimensions in the data mart, including factless facts, aggregates, and summary facts.
  • Reviewed source systems and proposed data acquisition strategy.
  • Designed and Customized data models for Data Mart supporting data from multiple sources on real time.
  • Analyzed both the source systems - Legacy and new commerce platform to map the data into data warehouse accordingly.
  • Designed and developed SAP BO DI jobs and workflows to load data from Source systems to ODS and then to Data Mart.
  • Extensively used the ERwin tool in forward and reverse engineering, following the corporate standards for naming conventions and using conformed dimensions whenever possible.
  • Excellent knowledge of dimensional, relational, logical & physical data modeling helped the client build an efficient data model. A new enterprise data warehouse was built as part of the project.
  • Built reports using SAP BO: universes, WebI, and DeskI.
  • Wrote stored procedures, T-SQL, and complex queries and guided the team on technical and domain-related issues. Excellent communication skills.
  • Scheduling of DWH jobs, scheduling of BO reports.
  • Excellent knowledge on materialized views.
  • Created join conditions, filters, LOVs, cardinalities, aliases, contexts, and aggregate awareness using Business Objects Designer.

Environment: SAP BO DI, Business objects XI R2/R3 on Windows XP (DeskI, WebI reports) Oracle 10g, SQL Server 2005, PL/SQL, T-SQL, Xcelsius.

Data Warehouse Developer

Confidential, Buffalo NY

Responsibilities:

  • Single point of contact for the Order Management, Procure-to-Pay, and Finance areas; involved in project planning, estimation, onsite-offshore and client coordination, and smooth delivery.
  • Lead a team of 6 at client’s location.
  • Experience handling critical projects for a demanding client. Experience working with Informatica, Oracle, Teradata & SAP Business Objects.
  • Experience in PL/SQL, stored procedures, BTEQ, complex Informatica mappings, and UNIX shell scripting.
  • Experience with SAP BODI and DQXI, and understanding of data quality concepts.
  • Experience providing lights-on support in a data warehouse environment. Excellent analysis and resolution skills.
  • Handled end-to-end standalone projects independently, delivering them efficiently and effectively. Gained the confidence and appreciation of the client.
  • Experience working with multilingual teams; worked with business users from Canada, Germany, Latin America, the UK & Singapore.
  • Knowledge of OLAP and of static and ad hoc reporting tools.

Environment: Informatica 8.6 on UNIX, Business Objects XI R2 on Windows XP (DeskI, WebI reports), Oracle 10g, Teradata, BTEQ and PL/SQL, SAP BO DI.
