
ETL Informatica Developer Resume


WA

SUMMARY

  • Over 7.5 years of strong Informatica development experience in the IT industry, with a very strong hold on data warehousing tools and certified industry-standard methodologies and procedures.
  • About 6.5 years of information technology experience in data warehousing using tools such as Informatica PowerCenter 9.1/8.6.1/8.1.1/7.x/6.x and PowerExchange 8.6.1/5.2 with Teradata and Oracle Warehouse Builder.
  • Expertise in data warehousing with a variety of data manipulation skills: data migration, data modeling, data cleansing and data validation.
  • Expertise in OLTP/OLAP system study, analysis and E-R modeling, developing database schemas such as star schema and snowflake schema used in relational, dimensional and multidimensional modeling.
  • Experience in data modeling using Erwin Data Modeler (Erwin 4.0/3.5.5) and ER/Studio for both physical and logical data modeling.
  • Installed and configured Informatica Server and Informatica Repository Server on Windows and UNIX operating systems.
  • Proficient in using Informatica Data Explorer (IDE), Informatica Data Quality (IDQ) and Data Analyst.
  • Experience with all Informatica components: Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Experience in integration of heterogeneous data sources such as Oracle, Teradata, SQL Server 2005/2008, Sybase, DB2, XML files, flat files and MS Access into a staging area, as well as from homogeneous sources.
  • Experienced with Informatica PowerExchange (8.x/5.x) for loading and retrieving data from mainframe systems and other enterprise-level data in a cost-effective manner.
  • Worked with PowerExchange to access, load and deliver data as part of the Extract-Transform-Load (ETL) process, simplifying the development and deployment of smaller departmental data marts and data warehouses as an incremental approach toward enterprise data warehousing.
  • Created complex mappings using Informatica B2B Data Transformation Studio (DT).
  • Extensive database experience with SQL in Oracle, MS SQL Server 2005/2008, DB2, Teradata, mainframe files, Sybase, flat files and MS Access, and knowledge of XML files.
  • Experience working in Oracle 8i/9i/10g/11g with database objects such as triggers, stored procedures, functions, packages, views and indexes.
  • Expertise in working with Teradata V2R5/V2R6 systems, using utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump and SQL Assistant.
  • Worked with global temporary tables and volatile tables and created secondary indexes when joining two tables for fast retrieval of data, improving performance and thereby efficiency.
  • Worked with Oracle Data Integrator (ODI) in the staging/integration layer, along with its pre-built modules for CDC, bulk loading, etc.
  • Worked extensively with the Teradata development team to better integrate data coming from different data sources and to improve load performance using utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump and TPT scripts.
  • Extensive experience in performance tuning; identified and fixed bottlenecks and tuned complex Informatica mappings for better performance.
  • Scheduled jobs and automated the workload using the Autosys job scheduler.
  • Used T-SQL to create triggers and stored procedures in SQL Server 2000/2005/2008 for reporting.
  • Worked with IBM mainframes and interacted with IBM tools such as EOS, Docutext, TPX and TSO.
  • Experience in using Crystal Reports, Business Objects XI and SAP BO in design and development of Business Intelligence reports.
  • Created complex SQL queries and used them in views and functions for generating complex reports.
  • Conducted System, UAT and Functionality testing and investigated software bugs.

TECHNICAL SKILLS

ETL Tool: Informatica PowerCenter 9.1/8.6.1/8.1/7.x/6.x, Informatica IDQ 9, PowerExchange 8.6/5.2, Informatica B2B DT Studio, PowerMart, OWB, ODI, DataFlux.

Languages: SQL, PL/SQL, T-SQL, C, C++, UNIX Shell Scripting, Perl Scripting, HTML, JavaScript.

Databases: Oracle 8i/9i/10g/11g, Teradata V2R5/V2R6/R13, MS SQL Server 2000/2005/2008, Sybase, Flat Files, XML Files, IBM Mainframe Systems and SFDC (salesforce.com).

Data Modeling, Data Architecture: Erwin 3.x/4.x, ER/Studio, Logical Modeling, Physical Modeling, Relational Modeling, ER Diagrams, Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables.

Query & Utility Tools: TOAD 8.x/7.x, SQL*Plus, SQL*Loader, Teradata SQL Assistant 7.1, BTEQ, FastExport, Sybase Advantage, FastLoad, MultiLoad, Teradata TPT Scripts, Oracle SQL Developer.

Reporting Tool: Business Objects XI R3, SAP BusinessObjects 6.5, Cognos 8/8.4/10

OS & Packages: MS-DOS, UNIX, Linux, Windows NT, MS Windows 98/2000/2003/XP, MS Office (MS Access, MS Excel, MS PowerPoint, MS Word).

ERP Packages: SAP ERP R/3, SAP ECC 6.0.

SDLC Methodologies: Agile, Waterfall and Scrum

PROFESSIONAL EXPERIENCE

Confidential

ETL Informatica Developer

Responsibilities:

  • Designed and developed mappings using Source Qualifier, Expression, connected and unconnected Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner and Rank transformations.
  • Developed mappings, reusable objects, transformations and mapplets using Mapping Designer, Transformation Developer and Mapplet Designer in Informatica PowerCenter 9.1.
  • Served as the administrator for DataFlux (DF) 2.2 solutions and any required supporting products for knowledge sharing, workflow and demonstration purposes.
  • Coordinated with the Informatica administration team during deployments.
  • Worked closely with users, developers and administrators to resolve production problems by reviewing design changes.
  • Worked on DataFlux along with web services that integrate quickly into business processes, applications and websites, providing near-seamless functionality that improves productivity and efficiency while reducing maintenance and management costs.
  • Provided reliable delivery of targeted project results as an expert in the application of specific DF methodologies, projects and technologies.
  • Participated in product and solution training to acquire and maintain detailed knowledge of the core components of DF offerings and assigned solution areas: how each solution addresses business challenges, competitive information showing how our solution stands apart, and which challenges and limitations may be encountered.
  • Knowledge of FTP and HIPAA-compliant ANSI-formatted EDI transactions, and thorough knowledge of security in handling PHI-secure health data transactions.
  • Implemented Type 2 Slowly Changing Dimensions to maintain historical data in the data mart as per the business requirements (a simplified SQL sketch of this pattern follows this list).
  • Proficient in using Informatica Data Explorer IDE 9, Informatica Data Quality IDQ 9 and Data Analyst.
  • Involved in data migration to a new server by decommissioning the existing production servers.
  • Involved in an MDM project across the global enterprise, improving the accuracy, reliability and timeliness of business-critical data.
  • Worked with Informatica PowerExchange 8.6.1, which works in conjunction with PowerCenter 9.1 to capture changes to data in source tables and replicate those changes to target tables and files.
  • Worked with PowerExchange CDC techniques for relational database sources on UNIX and Windows operating systems.
  • Created SAS documents for the configuration team to execute the workflows in order, moving the data from the Dev box to QAT environments for testing purposes.
  • Worked exclusively with BI/QA teams in troubleshooting defects and escalating design defects to architects; also worked on data quality tickets to improve the performance of slow-running queries.
  • Worked with existing OWB and ODI ETL and data integration tools to better accommodate data from the existing databases in the new ones.
  • Regularly involved in Scrum meetings for one project and in Agile ceremonies for another, following SDLC methodologies.
  • Used session parameters, mapping variables/parameters and parameter files to allow flexible workflow runs based on changing variable values.
  • Worked with the Informatica administrator to move project folders across development, test and production environments.
  • Involved in creating various UNIX and Perl scripts that support ETL job scheduling and initial validation of various tasks.
  • Wrote UNIX shell scripts for Informatica pre-session and post-session commands and Autosys scripts for scheduling the jobs (workflows).
  • Worked on PL/SQL scripts and converted PL/SQL stored procedures into Informatica ETL logic.
  • Involved in gathering the necessary PL/SQL calculations and converting the complex calculations into Informatica mappings.
  • Implemented different tasks in workflows, including Session, Command, E-mail, Event-Wait, Event-Raise and Timer.
  • Designed and implemented stored procedures and triggers for automating tasks in SQL Server 2005/2008, Oracle 10g.
  • Worked with SAP ECC 6.0 and PowerExchange 8.6.1 to extract data from SAP ECC 6.0.
  • Worked on the functionality upgrade and technical upgrade of SAP ECC 6.0.
  • Used QualityStage to check the data quality of the source system prior to the ETL process.
  • Participated in cross-functional efforts to support other teams, such as ETL and database tuning to support Cognos reporting performance.
  • Developed UNIX batch scripts for automating the execution of Informatica workflows and Perl scripts for loading data from source to target.
  • Worked with business people and have good experience collecting requirements, examining the needs of the business and suggesting the best development strategies to adopt.
  • Involved in unit testing all the code developed, testing the test data against the constraints and conditions the code has to pass, and then allowing the code to move into QAT.
  • Used Workflow Manager for creating, validating, testing and running sequential and concurrent sessions and scheduling them to run at specified times with the required frequency.
  • Maintained Versions of SQL Scripts, Documentation using StarTeam/Accurev.
  • Expertise in working wif Oracle Stored Programs, Packages, Cursors, Triggers, Database Link, Snapshot, Tables, Constraints, Views, Indexes, Sequences.
  • Involved in Performance Tuning of SQL Queries, Sources, Targets, Transformation and sessions by identifying and rectifying performance bottlenecks.
  • Experience in using the Normalizer transformation for normalizing XML source data.
  • Extensively used XML transformation to generate target XML files.
  • Created mappings to load data from flat files and relational sources into staging tables and into the Enterprise Data Warehouse, transforming the data according to business rules and populating the data mart with only the required information.
  • Extended Assistance in Production environment during Project deployment phase.
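
A minimal SQL sketch of the Type 2 SCD pattern referenced above, assuming hypothetical CUSTOMER_STG and CUSTOMER_DIM tables. In the project this logic was built in Informatica mappings (Lookup, Expression and Update Strategy transformations); the SQL below is only an illustrative outline of the expire-then-insert approach.

    -- Hypothetical objects, for illustration only.
    CREATE TABLE customer_stg (customer_id NUMBER, cust_name VARCHAR2(100), cust_addr VARCHAR2(200));
    CREATE TABLE customer_dim (customer_key NUMBER, customer_id NUMBER, cust_name VARCHAR2(100),
                               cust_addr VARCHAR2(200), eff_start_dt DATE, eff_end_dt DATE, current_flag CHAR(1));
    CREATE SEQUENCE customer_dim_seq;

    -- Step 1: close out the current row for any customer whose attributes changed.
    UPDATE customer_dim d
       SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.cust_name <> d.cust_name OR s.cust_addr <> d.cust_addr));

    -- Step 2: insert a new current row for changed and brand-new customers.
    -- Unchanged customers still have a 'Y' row, so they are skipped.
    INSERT INTO customer_dim (customer_key, customer_id, cust_name, cust_addr,
                              eff_start_dt, eff_end_dt, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.cust_name, s.cust_addr,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

The expire-then-insert order matters: once changed rows are closed out in step 1, the NOT EXISTS in step 2 re-inserts them as the new current version while leaving unchanged customers alone.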

Environment: Informatica PowerCenter 9.1, Oracle 10g, TOAD, Oracle SQL Developer, Microsoft SQL Server 2005/2008, DB2, HP Quality Center, XML, Flat Files, Windows, AccuRev, SAP ECC 6.0, HP-UX, Autosys, Erwin Data Modeler, Novell GroupWise mail client, George test data creator.

Confidential, WA

ETL Informatica Developer - Teradata DWH

Responsibilities:

  • Strong experience implementing Informatica ETL for efficient data loading from different sources: Oracle, Teradata, SQL Server 2005/2008, DB2 and flat files.
  • Applied transformation logic of varying complexity to transform and transfer the data into the downstream Teradata EDW.
  • Implemented scripts for Teradata utilities such as FastLoad, MultiLoad, TPump, FastExport and TPT in scheduling the loads to the Teradata EDW.
  • Worked with global temporary tables and volatile tables and used secondary indexes when joining two tables for fast retrieval of data, improving performance and thereby efficiency (see the Teradata SQL sketch after this list).
  • Involved in coding PL/SQL insert, update and delete scripts for data movement from source to staging and from staging to targets.
  • Involved in a data migration project to submit all the data to the government for the AT&T deal with Confidential to merge and acquire the company.
  • Worked extensively with the Teradata development team to better integrate data coming from different data sources and to improve load performance using utilities such as BTEQ, FastLoad, MultiLoad, FastExport, TPump and TPT scripts.
  • Worked extensively with the staging area to perform data cleansing, data profiling, data filtering and data standardization.
  • Designed and developed mappings using Source Qualifier, Expression, connected and unconnected Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner and Rank transformations.
  • Created SAS documents for the configuration team to execute the workflows in order, moving the data from the Dev box to QAT environments for testing purposes.
  • Worked exclusively with BI/QA teams in troubleshooting defects and escalating design defects to architects; also worked on data quality tickets to improve the performance of slow-running queries.
  • Extensively worked in Agile Methodology in all phases of Project.
  • Worked on JavaScript in some applications that support the company's main site.
  • Knowledge of FTP and HIPAA-compliant ANSI-formatted EDI transactions.
  • Worked with business people and have good experience collecting requirements, examining the needs of the business and suggesting the best development strategies to adopt.
  • Defined UNIX batch scripts for automating the execution of Informatica workflows.
  • Expertise in working wif Oracle Stored Programs, Packages, Cursors, Triggers, Database Link, Snapshot, Tables, Constraints, Views, Indexes, Sequences.
  • Worked exclusively on changing the legacy servers, migrating the code to new servers and changing the operating systems.
  • Developed SQL overrides in Source Qualifier and Lookup transformations according to business requirements.
  • Understood, utilized and communicated best-practice methodologies internally and externally.
  • Worked proactively with systems engineers, technical architects and other solutions consultants to strategize on opportunities, cross-training and knowledge transfer in DataFlux technologies.
  • Analyzed technical system problems, and designed and implemented effective solutions for existing applications in the DF area.
  • Used connected and unconnected Lookup transformations to look up values in flat files and relational sources and targets for full-load and incremental-load strategies.
  • Implemented Type 2 Slowly Changing Dimensions to maintain historical data in the sales data mart as per the business requirements.
  • Implemented different tasks in workflows, including Session, Command, E-mail, Event-Wait, Event-Raise and Timer.
  • Involved in Performance Tuning of SQL Queries, Sources, Targets, Transformation and sessions by identifying and rectifying performance bottlenecks.
  • Used stored procedures to drop indexes before the session run to improve session performance and rebuild them after the session completed.
  • Used Workflow Manager for creating, validating, testing and running sequential and concurrent sessions and scheduling them to run at specified times with the required frequency.
  • Worked on Speech Integrated Voice Recognition (SIVR) projects, creating code for the automated answering system to effectively handle the large volume of customers calling care, reducing the care load, serving data through SIVR and handing the call information to the EDW.
  • Used session parameters, mapping variables/parameters and parameter files to allow flexible workflow runs based on changing variable values.
  • Automated job processing using the Autosys scheduler, establishing automatic email notifications to the concerned persons by creating email tasks in Workflow Manager.
  • Unit tested all the mappings and sessions in the development environment and migrated them into the production environment after successful QAT testing.
  • Extended Assistance in Production environment during Project deployment phase.
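
A Teradata SQL sketch of the volatile-table and secondary-index pattern mentioned above. All table and column names are hypothetical; the production DDL, statistics and index strategy were project-specific.

    /* Hypothetical source and EDW tables. */
    CREATE MULTISET TABLE src_daily_feed
    ( sub_id INTEGER NOT NULL, status_cd CHAR(1) )
    PRIMARY INDEX (sub_id);

    CREATE MULTISET TABLE edw_usage_fact
    ( usage_id DECIMAL(18,0) NOT NULL, sub_id INTEGER NOT NULL, usage_min DECIMAL(12,2) )
    PRIMARY INDEX (usage_id);

    /* Stage one day's active subscribers in a session-scoped volatile table. */
    CREATE VOLATILE TABLE stg_active_subs
    ( sub_id INTEGER NOT NULL, load_dt DATE )
    PRIMARY INDEX (sub_id)
    ON COMMIT PRESERVE ROWS;

    INSERT INTO stg_active_subs (sub_id, load_dt)
    SELECT sub_id, CURRENT_DATE
    FROM   src_daily_feed
    WHERE  status_cd = 'A';

    COLLECT STATISTICS ON stg_active_subs COLUMN (sub_id);

    /* A secondary index on the fact table's join column helps the optimizer
       when it retrieves usage rows by sub_id. */
    CREATE INDEX idx_usage_sub (sub_id) ON edw_usage_fact;

    SELECT f.sub_id, SUM(f.usage_min) AS total_min
    FROM   edw_usage_fact f
    JOIN   stg_active_subs v ON f.sub_id = v.sub_id
    GROUP BY f.sub_id;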

Environment: Informatica PowerCenter 9.1/8.6/8.1, Teradata R13, Oracle 10g, Teradata SQL Assistant 7.1, TOAD, Oracle SQL Developer, Microsoft SQL Server 2005/2008, DB2, HP Quality Center, XML, Flat Files, Windows, Remedy, AccuRev, HP-UX, Autosys, Erwin, Business Objects XI Release 2, SAP BO.

Confidential, Baltimore, MD

Sr. Informatica Developer / Teradata

Responsibilities:

  • Developed complex Informatica mappings with various transformations.
  • Developed mappings, reusable objects, transformations and mapplets using Mapping Designer, Transformation Developer and Mapplet Designer in Informatica PowerCenter 8.6.1.
  • Analyzed, designed and implemented the ETL architecture and generated OLAP reports.
  • Worked with OWB and ODI ETL and data integration tools and their pre-built modules for CDC, bulk loading, etc.
  • Implemented data integrity controls that create a data "firewall" and reduce data preparation time by not processing erroneous data.
  • Helped determine and recommend the workflow for raw property file conversion, quality control, data standardization and cleansing along with DF.
  • Created and maintained various reporting tools needed to profile the data for the assessment team and the data quality team.
  • Maintained ownership of a plan for parallel processing while awaiting the delivery of a replacement system.
  • Worked with source system analysts, developers and business owners to better identify data sources when defining data extraction methodologies.
  • Migrated data from internal legacy DEVICC servers and legacy test ICC servers to new high-configuration servers to improve performance and handle greater loads.
  • Experience writing PL/SQL code for complex stored procedures covering some of the complex database calculations.
  • Involved in data migration to take over the small company which Confidential acquired for $34M.
  • Experience working with MDM, improving the tracking, transparency and auditing of financial data.
  • Designed the metadata tables and created mappings to populate them; these tables were used to generate the parameter files (a simplified SQL sketch follows this list).
  • Created mappings to load data from flat files and relational sources into staging tables and into the Enterprise Data Warehouse, transforming the data according to business rules and populating the data mart with only the required information.
  • Worked on JavaScript in some applications that support the company's main site.
  • Used transformations such as Expression, Filter, Joiner, Aggregator, Lookup (connected and unconnected), Update Strategy, Sequence Generator, Router and XML to load consistent data into the database.
  • Involved in writing SQL Stored procedures and Shell Scripts and Perl Scripts to access data from Oracle, and MySQL.
  • Created reusable transformations to clean the data, which were used in several mappings.
  • Developed and tested stored procedures, functions and packages in PL/SQL.
  • Extracted data from various sources such as MS SQL Server 2008, DB2, flat files, Excel spreadsheets, Oracle and XML files, and loaded it into the Oracle database.
  • Designed and implemented stored procedures and triggers for automating tasks in SQL Server 2005/2008, Oracle 10g.
  • Developed flowcharts for complex stored procedures.
  • Developed an automated process for loading the data into the base tables in the EDW (for ETL).
  • Created and executed workflows and worklets using Workflow Manager to load the data into the Oracle database.
  • Developed custom interface for executing and managing SSIS packages.
  • Maintained Versions of SQL Scripts, Documentation using StarTeam/Accurev.
  • Worked with the Informatica administrator to transfer project folders across development, test and production environments.
  • Involved in creating various UNIX and Perl scripts that support ETL job scheduling and initial validation of various tasks.
  • Wrote UNIX shell scripts for Informatica pre-session and post-session commands and Autosys scripts for scheduling the jobs (workflows).
  • Upgraded the systems from 8.1.1 to 8.6 with the help of the configuration team and extended support in resolving issues during the upgrade.
  • Generated ad hoc reports using Business Objects.
  • Used Informatica version control to check in all versions of the objects used in creating the mappings and workflows, keeping track of changes across the development, test and production environments.
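
A simplified SQL sketch of how a metadata table can drive Informatica parameter files, as referenced above: one row per parameter, and a query that emits the [Folder.WF:workflow] header plus name=value lines for a wrapper process to spool into the parameter file. Table, folder and workflow names are hypothetical; the project's metadata model was richer.

    -- Hypothetical metadata table, for illustration only.
    CREATE TABLE etl_param_meta
    ( folder_nm VARCHAR2(60), workflow_nm VARCHAR2(60),
      param_nm  VARCHAR2(60), param_val   VARCHAR2(200) );

    INSERT INTO etl_param_meta VALUES ('FIN_DWH', 'wf_load_gl_fact', '$$LOAD_DT',    '2011-06-30');
    INSERT INTO etl_param_meta VALUES ('FIN_DWH', 'wf_load_gl_fact', '$$SRC_SYSTEM', 'ORACLE_GL');
    COMMIT;

    -- Emit a section header per workflow, followed by its name=value pairs.
    SELECT '[' || folder_nm || '.WF:' || workflow_nm || ']' AS param_line,
           folder_nm, workflow_nm, 0 AS ord
    FROM   etl_param_meta
    GROUP BY folder_nm, workflow_nm
    UNION ALL
    SELECT param_nm || '=' || param_val, folder_nm, workflow_nm, 1
    FROM   etl_param_meta
    ORDER BY folder_nm, workflow_nm, ord, param_line;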

Environment: Informatica Power Center 8.6/8.1.1, Oracle 9i, SQL Server 2008, SSIS, PL/SQL, SQL, DB2, UNIX, Business Objects 5.1, Shell scripts.

Confidential, Charlotte, NC

Informatica Developer

Responsibilities:

  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Lookup, Filter, Joiner, Rank, Router, Update Strategy and XML. Developed various Ad-hoc mappings for various business needs.
  • Worked with the Teradata team in loading data using relational connections and utilities such as BTEQ, FastLoad, MultiLoad, FastExport, TPump and TPT scripts.
  • Worked with global temporary tables and volatile tables and created secondary indexes when joining two tables to improve performance.
  • Performed extraction, transformation and loading of data from RDBMS tables, flat files and SQL into Teradata in accordance with requirements and specifications.
  • Understood business needs and implemented them in a functional database design.
  • Developed new Informatica mappings and workflows and updated the old ones with additional columns, deleting columns that were not useful and moving some columns to other tables as specifications were developed.
  • Created SQL joins, sub queries, tracing and performance tuning for better running of queries in SQL Server 2005.
  • Created and Configured Dimensions, Attributes and Hierarchies.
  • Developed and tested all the backend programs, error-handling strategies and update processes.
  • Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Sequence Generator and Stored Procedure.
  • Wrote stored programs (procedures and functions) to perform data transformations and integrated them with Informatica programs and the existing application (illustrated in the PL/SQL sketch after this list).
  • Experience in using the Normalizer transformation for normalizing XML source data.
  • Involved in a data migration project to move from the existing server to a new one.
  • Extensively used XML transformation to generate target XML files.
  • Used QualityStage to check the data quality of the source system prior to the ETL process.
  • Participated in cross-functional efforts to support other teams - such as ETL and database tuning to support Cognos reporting performance
  • Developed Perl Script for loading of Data from Source to Target.
  • Created, scheduled and monitored workflow sessions on a run-on-demand and run-on-time basis using Informatica PowerCenter Workflow Manager.
  • Used the Erwin data modeling tool to build conceptual, logical and physical data models.
  • Developed UNIX shell scripts and scheduled them using the Autosys scheduling tool.
  • Designed Excel sheets documenting the test scenarios for each mapping.
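
A hypothetical PL/SQL function of the kind called from an Informatica Stored Procedure transformation to standardize incoming values during a mapping; the project's stored programs implemented their own business rules, so this is only a sketch.

    -- Hypothetical function: normalize a raw phone number to 999-999-9999.
    CREATE OR REPLACE FUNCTION fn_std_phone (p_raw IN VARCHAR2)
    RETURN VARCHAR2
    DETERMINISTIC
    IS
        v_digits VARCHAR2(20);
    BEGIN
        v_digits := REGEXP_REPLACE(p_raw, '[^0-9]', '');   -- keep digits only

        IF LENGTH(v_digits) = 10 THEN
            RETURN SUBSTR(v_digits, 1, 3) || '-' ||
                   SUBSTR(v_digits, 4, 3) || '-' ||
                   SUBSTR(v_digits, 7, 4);
        END IF;

        -- Anything else is passed through for downstream data-quality checks.
        RETURN p_raw;
    END fn_std_phone;
    /

    -- Quick check outside the mapping:
    SELECT fn_std_phone('(704) 555-1234') AS std_phone FROM dual;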

Environment: Informatica PowerCenter 8.6.1, Teradata V2R5/V2R6, SQL Server 2005/2008, T-SQL, TOAD, Oracle SQL Developer, Quality Center, Windows 2008, UNIX, Perl Scripting, Cognos, Shell scripts, Autosys.

Confidential - Chantilly, VA

Sr. Informatica Developer

Responsibilities:

  • Extensively used Informatica PowerCenter for extracting, transforming and loading data from relational sources and non-relational sources like flat files.
  • Extensively used various transformations such as Source Qualifier, Expression, Lookup, Sequence Generator, aggregator, Update Strategy, and Joiner while migrating data from various heterogeneous sources like Oracle, DB2, XML and Flat files to Oracle.
  • Developed Informatica mappings, re-usable transformations, re-usable mappings and Mapplets.
  • Developed mappings using Designer to extract and transform data according to the requirements and load it into the database.
  • Handled Type 2 slowly changing dimensions to populate current and historical data to dimension and fact tables in the data warehouse.
  • Designed complex UNIX scripts and automated them to run the workflows daily, weekly and monthly.
  • Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Workflow Manager.
  • Created sessions, database connections and batches using Informatica Workflow Manager.
  • Involved in data validation for a data migration project related to a merger.
  • Developed mappings to load data in slowly changing dimension.
  • Scheduled sessions and batches on the Informatica server using Informatica Workflow Manager.
  • Created complex PL/SQL stored procedures and functions, monitored the ETL jobs and fixed bugs (see the PL/SQL sketch after this list).
  • Worked along with the DBA to resolve performance and tuning issues.
  • Provided reliable, timely support of integration, performance and user acceptance testing processes.
  • Involved in unit testing, integration testing and system testing, as well as other portions of testing.
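
A PL/SQL sketch of the basic error-handling and audit-logging shape used around the ETL jobs referenced above; all table and procedure names are hypothetical, and the production procedures carried project-specific calculations.

    -- Hypothetical tables, for illustration only.
    CREATE TABLE acct_txn      (acct_id NUMBER, txn_amt NUMBER);
    CREATE TABLE acct_summary  (acct_id NUMBER, bal_amt NUMBER, as_of_dt DATE);
    CREATE TABLE etl_job_audit (job_nm VARCHAR2(60), run_ts DATE DEFAULT SYSDATE,
                                status_cd VARCHAR2(10), err_msg VARCHAR2(4000));

    CREATE OR REPLACE PROCEDURE sp_refresh_acct_summary
    IS
    BEGIN
        -- Rebuild a small summary table consumed by downstream reports.
        EXECUTE IMMEDIATE 'TRUNCATE TABLE acct_summary';

        INSERT INTO acct_summary (acct_id, bal_amt, as_of_dt)
        SELECT acct_id, SUM(txn_amt), TRUNC(SYSDATE)
        FROM   acct_txn
        GROUP BY acct_id;

        INSERT INTO etl_job_audit (job_nm, status_cd)
        VALUES ('SP_REFRESH_ACCT_SUMMARY', 'SUCCESS');
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            INSERT INTO etl_job_audit (job_nm, status_cd, err_msg)
            VALUES ('SP_REFRESH_ACCT_SUMMARY', 'FAILED', SUBSTR(SQLERRM, 1, 4000));
            COMMIT;
            RAISE;
    END sp_refresh_acct_summary;
    /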

Environment: Informatica Power Center 8.5/7.1, Oracle 10g/9i, DB2, XML, Flat files, SQL, PL/SQL, TOAD, SQL*Plus, Windows, UNIX.

Confidential, San Francisco, CA

ETL Developer

Responsibilities:

  • Responsible for development, support and maintenance of the ETL processes using Informatica PowerCenter.
  • Designed Sources to Targets mappings primarily for Excel, Flat files, Teradata to SQL server using Informatica Power Center.
  • Various kinds of Transformations were used to implement simple and complex business logic.
  • Created mappings using Lookup, Aggregator, Router and Joiner transformations for populating target data in efficient manner.
  • Implemented Business Logic using PL/SQL Stored Procedures, Functions, and other Packages.
  • Created reusable transformations, Mapplets and used them in Mappings.
  • Fine-tuned transformations and mappings for better performance.
  • Automated worklet and session schedules using UNIX shell scripts.
  • Wrote pre-session shell scripts to check the session mode (enable/disable) before running/scheduling batches, and scheduled the workflows using the Autosys scheduler.
  • Troubleshot problems by checking sessions and error logs.
  • Generated Unit Test Plans and documents.
  • Involved in developing different dashboards/reports with different views using Business Objects.
  • Responsible for on-call support during the month-end OLAP reporting of financials.

Environment: Informatica Power Center 6.1/5.1, ERWIN 3.5, Oracle9i, SQL*PLUS, TERADATA V2R5, UNIX, SQL SERVER 2005, TOAD, Business Objects, Autosys.

Confidential

BI/ETL Tester/DWH QA

Responsibilities:

  • Involved in all aspects of the software development life cycle. This covers designing, coding, testing, implementing, deploying and continued support of all projects owned by the team.
  • Interacted with business users as well as other members of IT outside the team. Involved in preparation of specifications, and in the development and testing of scripts.
  • Generated DDL queries for the creation of new database objects such as tables, views, sequences, functions, synonyms, indexes, triggers, packages, stored procedures and roles, and granting privileges (a representative DDL sketch follows this list).
  • Developed PL/SQL code on the basis of change requests.
  • Implemented Informatica PowerCenter to build an Oracle data warehouse from different OLTP systems.
  • Implemented the standards for creating Informatica workflows and mappings.
  • Implemented standards for naming conventions, mapping documents, technical documents and migration forms.
  • Extensive experience with end users in understanding and analyzing requirements.
  • Analyzed the end-user requirements and was involved in modeling the ETL schema.
  • Performed detailed analysis of the data provided by the respective source systems and filled in the gaps in the mapping specs.
  • Experience in designing the complete end-to-end data flow and designing the ETL jobs.
  • Created Users/Permissions/Folders/Shared Folders/Deployment Groups in 7.1/8.6
  • Worked on Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer
  • Used ETL to extract and load data from different databases and flat files to Oracle.
  • Created complex mappings and workflows as per the business logic.
  • Involved in writing numerous functions and stored procedures.
  • Responsible for data extraction and transformation from disparate sources such as Oracle, DB2, SAP, SQL Server and flat files, and loading to Oracle using Informatica PowerCenter.
  • Involved in designing Informatica mappings by translating the business requirements.
  • Involved in performance tuning and optimization of Informatica mappings and sessions.
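
A representative DDL sketch of the kinds of objects described above (table, index, sequence, view, synonym, role and grant); all names are hypothetical and the real objects were project-specific.

    -- Hypothetical staging table with a primary key and a lookup index.
    CREATE TABLE claim_stg
    ( claim_id  NUMBER(12)   NOT NULL,
      member_id NUMBER(12)   NOT NULL,
      claim_amt NUMBER(12,2),
      svc_dt    DATE,
      load_dt   DATE DEFAULT SYSDATE,
      CONSTRAINT pk_claim_stg PRIMARY KEY (claim_id) );

    CREATE INDEX idx_claim_stg_member ON claim_stg (member_id);

    CREATE SEQUENCE claim_stg_seq START WITH 1 INCREMENT BY 1;

    -- View restricted to the current load, exposed through a synonym.
    CREATE OR REPLACE VIEW v_claim_curr AS
    SELECT claim_id, member_id, claim_amt, svc_dt
    FROM   claim_stg
    WHERE  load_dt >= TRUNC(SYSDATE);

    CREATE SYNONYM claim_curr FOR v_claim_curr;

    -- Reporting role with read-only access to the view.
    CREATE ROLE etl_report_role;
    GRANT SELECT ON v_claim_curr TO etl_report_role;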

Environment: Oracle 8i, TOAD, SQL, PL/SQL, Software Development Life Cycle.
