Sr. Informatica Developer Resume

Herndon, VA

SUMMARY:

  • 14 years of IT experience in Data warehousing and Business intelligence with emphasis on Business Requirements Analysis, Application Design, Development, testing, implementation and maintenance of client/server Data Warehouse and Data Mart systems in the Financial, Insurance, Pharmaceutical and Telecom industries.
  • Experienced with all phases of the SDLC (System Development Life Cycle), from requirements gathering through testing. Participated in the requirements gathering phase to clearly understand the purpose of developing a data warehouse solution; gathered information about business processes and business entities, and produced diagrams and reports, including ER diagrams and process flow diagrams.
  • 10 years of development and design of ETL methodology for supporting data transformations and processing in a corporate-wide ETL solution using Informatica Power Center 10.1.1/9.6.1/8.6.1/8.1.1/8.0/7.1/7.0/6.2/6.1/5.1.2/5.1.1/4.7 (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Informatica PowerMart 6.2/6.1/5.1.2/5.1.1/4.7, PowerConnect, PowerPlug, PowerAnalyzer, PowerExchange, Datamart, Autosys.
  • 8 years of strong data cleansing experience using functions like ReplaceChr, ReplaceStr and IsDate in Informatica. Worked with data profiling.
  • Involved in the design of Dimensional Models, Star Schemas and Snowflake Schemas. Knowledge of Ralph Kimball's and Bill Inmon's approaches to Data Warehousing.
  • Understanding of OLAP, OLTP, Business Intelligence and Data Warehousing concepts with emphasis on ETL and Business Reporting needs.
  • Developed test cases for business and user requirements to perform System/Integration/Performance testing.
  • Extensively used Mercury Quality Center to load test cases, execute them and log defects found in system testing / UAT.
  • Knowledge of Java and Web services.
  • Good understanding of Dimensional and Data Modeling using Star and Snowflake schemas. Experience with Fact and Dimension tables.
  • Enhanced the Logical and Physical Data Models of the Enterprise Data Warehouse to incorporate new requirements by creating new subject areas.
  • Established the Entities, Attributes and Relationships and created the Logical data model (LDM) and mapped the same to the Physical data model (PDM).
  • 6 years of database experience using Oracle 11g/10g/9i/8i/7.x, Sybase 12.5, MS Access 7.0/2000, SQL, PL/SQL, SQL*Plus, SQL*Loader and Developer 2000. Hands-on working knowledge of Oracle and PL/SQL, writing stored procedures, packages, functions and triggers. Adept at working with distributed databases and SQL*Loader.
  • 6 years of UNIX shell scripting experience using utilities such as sed, awk, grep and wc in shell scripts. Worked with the crontab command to schedule jobs and the ftp command to access remote servers.
  • Kicked off Informatica workflows by running JCLs from the mainframe environment.
  • Experienced with PowerExchange and PowerConnect to connect to and import sources from external systems like Mainframe (DB2, VSAM) or MQ Series.
  • Used the pmcmd command in non-Windows environments to communicate with the Informatica server. Optimized/tuned mappings for better performance and efficiency; performed performance tuning of SQL queries, sources, targets and sessions.
  • Experienced in providing production support on a 24/7 basis and monitored the execution of Informatica Workflows.
  • Maintained Development, Test and Production mapping migration using Repository Manager. Also used Repository Manager to maintain metadata, security and reporting. Tuned Informatica mappings for optimum performance.
  • Experienced in Managing the Metadata and release management. Participated in creating the ETL Metadata for new sources/targets and importing the base line code from production for enhancements.
  • Worked extensively with the ECMS/ARM process to move the code from individual folders to the Project folder and then deploy the approved code into UAT and Production environments.
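
A sketch of the kind of flat-file check these UNIX utilities support; the file name, pipe delimiter, five-field layout and cron path below are illustrative assumptions, not details from any actual engagement:

```shell
#!/bin/sh
# Sketch of a flat-file sanity check built from wc, awk and grep.
# File name, delimiter and field count are illustrative assumptions.
FEED=customer_feed.dat

# Demo data standing in for an ETL extract (second record is malformed).
printf 'C001|Acme|NY|2008-01-02|100\nC002|Beta|CA\n' > "$FEED"

# Actual line count, to compare against a control total from the source.
actual=$(wc -l < "$FEED")
echo "record count: $actual"

# Report records that do not have exactly 5 pipe-delimited fields.
awk -F'|' 'NF != 5 {print NR ": bad field count (" NF ")"}' "$FEED"

# Flag any rows containing non-printable (foreign) characters.
grep -n '[^[:print:]]' "$FEED" || true

# A crontab entry to run such a check nightly at 2:00 AM might look like:
# 0 2 * * * /home/etl/scripts/validate_feed.sh
```

In practice such a script would be parameterized with the feed path and wired into cron or a pre-session command rather than generating its own demo data.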

TECHNICAL SKILLS:

Data Warehousing: Informatica Power Center 10.1.1/9.6.1/8.6.1/8.5/8.1.1/7.1.2, Informatica PowerMart 6.1/5.1/5.0/4.7, Informatica PowerConnect, Informatica PowerExchange 8.1.1, OWB and OLAP.

Dimensional Data Modeling: Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling, Erwin 7.0/4.5/4.0/3.0, Oracle Designer, VISIO

Business Intelligence: Cognos Series 7, ReportNet 1.1, Cognos 8 BI, Business Objects 6.0/5.1/5.0 (Web Intelligence 2.5, Designer 5.0, Developer Suite & Set Analyzer 2.0), Developer 2000 (Forms 4.5/5.0, Reports 2.5/3.0), MS Access Reports

GUI: Visual Basic 5.0/6.0, Oracle Forms 6.0, Oracle Reports 6.0, VC++, FrontPage 97/98/2000, Visio and TOAD 6.0/5.0.

RDBMS: Oracle 11g/10g/9i/8i/7.0, Sybase 12.5, MS SQL Server 2005/2000/7.0, SQL Server DTS, DB2, MS Access 7.0/2000.

Tools: Developer 2000 (Forms 4.5/5.0, Reports 2.5/3.0), Erwin, MS Access Reports, SQL*Loader, SQL Navigator 4.4, SQL Developer, TOAD 11.5, Flash, Adobe, MQC, Autosys.

Languages: COBOL, C, C++, J2SE 1.2, FORTRAN

Environment: UNIX, MS-DOS, Sun Solaris 2.7, HP-UX, AIX 4.3.3, Windows 9X/2000/NT/XP, Sun Ultra, Sun SPARC, Sun Classic

PROFESSIONAL EXPERIENCE:

Confidential

Sr. Informatica Developer

Responsibilities:

  • Participated in performance enhancement in Informatica sessions.
  • Previously there was one lookup creating a source dump for 7 instruments, which is not advisable performance-wise as it occupies a lot of primary memory, so I split this lookup into 7 lookups to create a separate source dump for each individual instrument.
  • Participated in developmental work as a part of enhancements and Defect fixing of the system.
  • Worked with the Java transformation to parse the data into corresponding values.
  • Created the Autosys JILs for automation of the CCRIP process.
  • Extensively used Autosys commands for starting, stopping, killing and verifying the status of jobs throughout the entire project.
  • Documented in detail the purpose of each workflow so that it is helpful to future developers.

Environment: Informatica Power Center 9.6.1/10.1.1, Oracle 11g, TOAD 11.5, PL/SQL, UNIX, FTP, SFTP.

Confidential, Herndon, VA

Sr. Informatica Developer

Responsibilities:

  • Participated in requirement analysis with the help of business model and functional model.
  • Developed parameter driven ETL process to map source systems (XML Sources) to target data warehouse with Informatica using various transformations like Expression, Router, Update strategy, Connected and Unconnected lookups, Aggregator, Sorter, Sequence generator, etc. and having files as a source and targets along with database objects.
  • Extensively worked with XML sources which are from external source system and pulled the data from multiple groups (views) and joined them as per the requirements.
  • Extensively worked with exception data and routed it to a separate exception data store for further analysis.
  • Conducted ETL development in the Netezza environment using standard design methodologies.
  • Developed test cases and performed unit testing after conducting peer reviews of all the workflows before moving them to higher environments.
  • Created High Level / Low level Design Documents, Application Control Documents (Supplemental documents) and Interface Control Documents and uploaded them in DOORS.
  • Extensively used Autosys commands for starting, stopping, killing and verifying the status of jobs throughout the entire project.
  • Experienced working with Rational ClearCase for version control and migrating artifacts to higher environments.
  • Created Run Books and Release Notes for the migration of the code and performed migration through the ClearCase tool.
  • Tracked defects using the ClearQuest tool and generated defect summary reports.
  • Attended the daily Defect calls and resolved the defects after discussing with the team members.

Environment: Informatica Power Center 9.5.1/9.6.1, Oracle 11g, Netezza 6.x, TOAD 11.5, PL/SQL, UNIX, Rational ClearCase, ClearQuest, MDM, FTP, SFTP.

Confidential, New York, NY

Sr. Informatica Developer

Responsibilities:

  • Interacted with business users from different business and operations teams for requirements gathering.
  • Prepared technical specifications from business requirements to develop and implement ETL processes using Informatica.
  • Loaded data into the stage and FACT tables to generate and validate the reports and send them out for analysis.
  • Documented business requirements, discussed issues to be resolved and translated user input into ETL design documents.
  • Worked with data analysts to implement Informatica mappings and workflows, shell scripts and stored procedures to meet business requirements.
  • Developed shell scripts for merging files, deleting temporary files, sending the files to a specified location, etc.
  • Created interfaces using different types of mappings using various transformations like Expression, filter, router, Aggregator, Look up, Joiner, Stored Procedure, Update Strategy, Normalizer Etc.
  • Extensively worked with Mapplets to design a common logic that can be used across several modules.
  • Worked with Sync processes to detect changed and new data in the source, inserting new records and updating existing ones.
  • Worked with Fusion UI to upload the files which will trigger the whole process from loading the stage table, validating the data elements and then loading the mart with successful source records.
  • Worked with Fusion UI to kick off the events such as Sync Process, BPS Process, etc., through the Event Manager.
  • Verified/validated the reports again through the Fusion UI and re-validated the code in case of any issues.
  • Acted as the metadata coordinator and participated in importing the metadata into the project folders so that its shortcuts can be used by individual developers.
  • Worked with the ECMS/ARM process to move the code from individual folders to the Project folder and then deploy the approved code into UAT and Production environments.
  • Experienced with IRA and SVN tools to export and import the code from one environment to another.
  • Conducted the unit testing to find and resolve the issues in a single piece of code immediately after the design.
  • Experienced with Control-M for automating the workload at daily, weekly or monthly intervals.
  • Coordinated with the testing team and data owners to resolve the defects as and when they are logged by the testing team.
  • Provided production support and solved production issues as and when they were noticed.
  • Coordinated with the offshore development team by delegating the tasks and guiding them in building the code and solving the issues.
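
The file-merge and cleanup scripting described in this role can be sketched as follows; the directory layout and file-naming convention are assumptions for illustration, not from the actual project:

```shell
#!/bin/sh
# Sketch of merging per-session part files into one delivery file and
# deleting the temporary pieces. Paths and names are illustrative.
WORK=./work_demo
mkdir -p "$WORK"

# Demo part files standing in for the outputs of parallel sessions.
echo "row1" > "$WORK/out_part_1.dat"
echo "row2" > "$WORK/out_part_2.dat"

# Merge the parts, in order, into a single date-stamped file.
TARGET="$WORK/daily_extract_$(date +%Y%m%d).dat"
cat "$WORK"/out_part_*.dat > "$TARGET"

# Remove the temporary part files only after a non-empty merge.
[ -s "$TARGET" ] && rm -f "$WORK"/out_part_*.dat

# The merged file would then be pushed to its destination, e.g. via sftp.
echo "merged $(wc -l < "$TARGET") rows into $TARGET"
```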

Environment: Informatica Power Center 9.5, Oracle 11g/10g/9i, MS Access, TOAD 11.5, Rapid SQL, PL/SQL, UNIX, Mercury Quality Center, Windows NT 4.0

Confidential, Thousand Oaks, CA

Sr. Informatica Developer

Responsibilities:

  • Interacted with various business people on the external vendors' side, gathered the business requirements and translated them into technical specifications.
  • Documented business requirements, discussed issues to be resolved and translated user input into ETL design documents.
  • Experienced working with team and lead developers; interfaced with business analysts, coordinated with management and understood the end-user experience.
  • Worked with data analysts to implement Informatica mappings and workflows, shell scripts and stored procedures to meet business requirements.
  • Involved in ETL process from development to testing and production environments.
  • Responsible for creating interfaces using different types of mappings with various transformations like Expression, Filter, Router, Aggregator, Lookup, Joiner, Stored Procedure and Update Strategy.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters for the delta process to extract only the additional data added during that period.
  • Worked with data cleansing functions like ReplaceStr, ReplaceChr, IsNull, Is_Spaces and Is_Number to clear the data of any foreign characters.
  • Extensively used the debugger in identifying bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Worked with mappings to dynamically generate parameter files used by other mappings.
  • Involved in performance tuning of the ETL process by addressing various performance issues at the extraction and transformation stages.
  • Extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings and sessions.
  • Documented the mappings used in ETL processes including the Unit testing and Technical document of the mappings for future reference.
  • Designed and Developed the Informatica workflows/sessions to extract, transform and load the data into Target.
  • Scheduled the Informatica workflows to start at a specified date and time and repeat on a recurring basis.
  • Participated in unit testing to validate the data in the flat files that are generated by the ETL Process.
  • Logged on to Quality Center to find the defects posted by the testing team and fixed those defects.
  • Used SQL Loader to load the flat file data into temp tables to compare the data with that generated from CMA.
  • Involved in creation of Schema objects like Indexes, Views, and Sequences.
  • Wrote SQL and PL/SQL code and stored procedures for dropping and re-creating indexes, generating Oracle sequences, and updating the Process Run table automatically to make the delta process work accurately.
  • Used MKS Toolkit to successfully execute the UNIX shell scripts that perform pre/post-session tasks.
  • Extensively worked with Unix Shell scripting to validate and verify the data in the flat files generated by the ETL process.
  • Developed post-session and pre-session shell scripts for tasks like merging flat files after loading, deleting temporary files, changing the file name to reflect the file generation date, etc.
  • Designed and developed UNIX shell scripts as part of the ETL process to compare control totals, automate the process of loading, pulling and pushing data from and to different servers.
  • Developed the UNIX shell scripts to send out an E-mail on success of the process indicating the destination folder where the files are available.
  • Involved in migration of Informatica mapping from Development to Production environment.
  • Coordinated with the informatica administration team during deployments.
  • Provided the production support and solved the production issues immediately after they are noticed.
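
A post-session script of the kind described above might look like this sketch; the file names are illustrative, and the mail step is commented out since mailx availability on the server is an assumption:

```shell
#!/bin/sh
# Sketch of a post-session task: stamp the generated file with its
# generation date and notify on success. Names are illustrative.
OUT=extract.dat
echo "demo row" > "$OUT"   # stands in for the session's target flat file

# Rename the file to reflect the date it was generated.
STAMPED="extract_$(date +%Y%m%d).dat"
mv "$OUT" "$STAMPED"

# On success, e-mail the destination folder to the support mailbox
# (mailx assumed available on the ETL server):
# echo "Load complete; file in $(pwd)" | mailx -s "ETL success" etl-support@example.com

echo "renamed to $STAMPED"
```

Informatica would invoke such a script as a post-session success command, so it only runs when the load completed cleanly.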

Environment: Informatica Power Center 8.6.1/7.1.4, Oracle 11g/10g/9i, MS Access, TOAD 9.0, SQL Developer, PL/SQL, UNIX, Mercury Quality Center, Windows NT 4.0

Confidential, Owings Mills, MD

Informatica Developer /Lead ETL Test Analyst

Responsibilities:

  • Interacted with various business people in MVS and Facets side and gathered the business requirements and translated them into technical specifications.
  • Documented business requirements, discussed issues to be resolved and translated user input into ETL design documents.
  • Worked with data analysts to implement Informatica mappings and workflows, shell scripts and stored procedures to meet business requirements.
  • Experienced in creating XML Definition in power center repository when the source is a XML file.
  • Involved in ETL process from development to testing and production environments.
  • Guided the testing team (three testers) for CVP to perform end to end testing.
  • Developed test cases and mapped them to various business and user requirements.
  • Loaded the test cases in MQC and guided the team with execution of these test cases and logging of defects in MQC.
  • Involved in meetings as a test lead to discuss the test scenarios with the business team and got approval for system testing and UAT.
  • Extracted data from various sources like Oracle and flat files and loaded into the target Oracle database.
  • Created mapping using various transformations like Joiner, Filter, Aggregator, Lookup, Stored Procedures, Router, Sorter, Rank, Expression, Normalizer and Update Strategy.
  • Developed Informatica mappings to generate the FVR (Facets Voucher Record - claims processed for the current date), which contains EOBs (Explanations of Benefits), Notices of Payment (NOPs) and checks issued for the current voucher date.
  • Loaded TDS (tactical data store) with the derived voucher fields from MVS (Medical Vouchering System) from the reverse flow flat file.
  • Involved in performance tuning of the ETL process by addressing various performance issues at the extraction and transformation stages.
  • Extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings and sessions.
  • Created and Executed workflows and Worklets using Workflow Manager to load the data into the Target Database.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.
  • Developed Session Parameter files for the workflows.
  • Extensively participated in System/Integration/Performance testing.
  • Analyzed the source fields and wrote SQL queries for field to field validation by referring source to target mapping document.
  • Developed test cases for business and user requirements to identify claims for Institutional, Professional, Subscriber-paid, etc., and wrote SQL queries to validate the data on the source and target databases according to the source-to-target mapping document.
  • Involved in regular discussions with the Facets team to enter test data.
  • Provided test data as per the test data requirements provided by the Medical Vouchering System team.
  • Extensively used Mercury Quality Center to load test cases, execute them and log defects found in system testing.
  • Regularly ran JCLs to kick off jobs from mainframes.
  • Extensively worked with UNIX shell scripting (Korn shell) to validate and verify the data in the flat files sent to MVS.
  • Responsible for ETL process under development, test and production environments.
  • Handled Production issues and monitored Informatica workflows in production.
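
The Korn-shell file verification mentioned above can be sketched in POSIX shell; the header/detail/trailer (H/D/T) record layout and file name are assumptions for illustration:

```shell
#!/bin/sh
# Sketch of a control-total check: the trailer record carries the
# expected number of detail records. The layout is an assumption.
FEED=mvs_feed.dat

# Demo data: header, two detail records, trailer claiming 2 details.
printf 'H20080101\nD001\nD002\nT2\n' > "$FEED"

# Count the detail records actually present in the file.
details=$(grep -c '^D' "$FEED")

# Extract the expected count from the trailer record.
expected=$(awk '/^T/ {sub(/^T/, ""); print; exit}' "$FEED")

if [ "$details" -eq "$expected" ]; then
    echo "control totals match: $details"
else
    echo "MISMATCH: trailer says $expected, file has $details" >&2
fi
```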

Environment: Informatica Power Center 8.5/8.1, Perl, Informatica PowerExchange 8.1.1, Oracle 11g/10g, DB2, MS Access, Facets 4.51, TOAD 9.0, SQL Developer, mainframe JCL, PL/SQL, UNIX, Mercury Quality Center, Windows NT 4.0

Confidential, NY

Senior Informatica Developer

Responsibilities:

  • Used PowerExchange to create a data map for the given dataset name in the mainframe environment.
  • Imported this data map, which represents the data, into the Informatica repository using the Informatica ODBC driver.
  • Interacted with various departments in the mainframe environment to understand the format of the data coming from it.
  • Used Informatica Designer to create complex mappings using different transformations to move data to Oracle database.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse using different transformations like Source Qualifier, Expression, Filter, Router, Lookup, Aggregate, Rank, Update Strategy and Joiner.
  • Worked extensively with Normalizer transformation to normalize the data from the Mainframe environment.
  • Developed expressions to convert dates in Gregorian and Julian formats to a standard date format that suits the database.
  • Developed logic to denormalize the data from database to upload it back to mainframe.
  • Extracted (Oracle), Transformed and Loaded data into Flat files using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Filter, Router and Expression).
  • Used FTP connection browser available in Informatica Workflow manager to FTP the flat files into the Mainframe environment.
  • Designed and Developed the Informatica workflows/sessions to extract, transform and load the data into Target.
  • Executed sessions, sequential and concurrent batches for proper execution of mappings and sent e-mail using server manager.
  • Wrote SQL and PL/SQL code, stored procedures and packages: procedures for dropping and re-creating indexes and generating Oracle sequences, procedures that clean up the base tables automatically in case the mapping has already run, a procedure that logically expires existing records and updates specific codes, and packages that incorporate business rules, sometimes eliminating transformation dependencies.
  • Responsible for ETL process under development, test and production environments.
  • Handled Production issues and monitored Informatica workflows in production.

Environment: Informatica Power Center 8.5/8.1.1, 7.1.2 (Designer, Workflow Manager), Perl, Power Exchange 8.1.1, Oracle 11g/10g, SQL Navigator 5.0, Windows NT.

Confidential, Washington, DC

Senior DW Informatica Developer

Responsibilities:

  • Used Informatica Designer to create complex mappings using different transformations to move data to a Data Warehouse.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse using different transformations like Source Qualifier, Expression, Lookup, aggregate, Update Strategy and Joiner.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations.
  • Scheduling the sessions to extract, transform and load data in to warehouse database on Business requirements.
  • Extracted (Sybase, Oracle, Flat files), Transformed and Loaded data into the staging area and Data Warehouse (Oracle) using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Filter, Router and Expression).
  • Designed and developed complex Router, Sequence Generator, Ranking, Aggregator, Joiner, Lookup, Update Strategy transformations to load data identified by dimensions using Informatica ETL (Power Center) tool.
  • Involved in creation and usage of stored procedures and Informatica PowerMart for data loads into the data mart on a weekly basis.
  • Read and understood the COBOL logic in the mainframe environment, then transformed and coded that logic into Informatica to load the target tables.
  • Used Informatica Identity Resolution to search and match identical data coming from various sources.
  • Kicked off the Informatica workflows by running JCLs from the mainframe environment.
  • Generated shell scripts on the UNIX side to move invoices to Accounts Payable at regular intervals.
  • Involved in creating and managing repositories using Repository Manager.
  • Involved in design and creation of Fact tables.
  • Created Data Model from data sources and defined the dimensions, levels and measures in the model.
  • Interacted with various departments, gathering their needs and developing reports based on user requirements.
  • Experienced in generating various complex reports using Cognos. Worked with Framework Manager to create BI metadata. Experienced in using Query Studio and Report Studio to create ad-hoc and complex reports.
  • Extensively worked with Unix Shell (Korn Shell) scripting to validate and verify the data that is loaded in the flat files.

Environment: Informatica Power Center 8.1.1, Oracle 10g, PL/SQL, SQL Server 2000/2005, T-SQL, SQL Server DTS, Sybase 12.5, DB2, Windows NT, UNIX Shell Scripts, PowerBuilder 6.0, DBArtisan 7.2.1, Cognos 8.1

Confidential, White Plains, NY

Senior DW Informatica Developer

Responsibilities:

  • The project goal was to retire the existing warehouse (STARLINK) on a DB2 mainframe and migrate it to a new warehouse on Oracle 10g. The process involved moving data from the ODS (operational data source) to CDC (change data capture) to STG (the staging area) and then to the warehouse base tables before finally populating our data mart containing Dimensions and Facts.
  • Developed complex mappings in Informatica to load the data from various sources using different transformations like Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router. Used the debugger to test the mappings and fix the bugs.
  • Worked on 200 TYPE 1 and 200 TYPE 2 mappings, including a Factless Fact table and simple and complex dimensions.
  • Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system for Facts and Dimensions
  • Developed the data marts based on the requirements using Informatica Power Center; also worked on Snapshot and Shadow tables.
  • Since the major part of the workload was extracting data from one stage to another, created and monitored workflows and tasks using Informatica Power Center Workflow Manager.
  • Documented the mappings used in ETL processes including the Unit testing and Technical document of the mappings for future reference.
  • Designed and Developed the Informatica workflows/sessions to extract, transform and load the data into Target.
  • Executed sessions, sequential and concurrent batches for proper execution of mappings and sent e-mail using server manager.
  • Wrote SQL and PL/SQL code, stored procedures and packages: procedures for dropping and re-creating indexes and generating Oracle sequences, procedures that clean up the base tables automatically in case the mapping has already run, a procedure that logically expires existing records and updates specific codes, and packages that incorporate business rules, sometimes eliminating transformation dependencies.
  • Generating Data feeds using Informatica mappings from data mart fact and dimension tables for business owners for their analysis and business decisions.
  • Configuring Mapping / Session properties to handle different frequencies of reports; FTP to remote server; mapping parameters values for the start dates, end dates and target file names.
  • Performed tuning of the mappings and workflows to overcome complexities involving SQL overrides and Aggregator transformation property settings for data feeds.
  • Setting up Batches and sessions to schedule the loads at required frequency using Power Center Server manager.
  • Involved in writing shell scripts to automate pre-session and post-session processes.
  • Maintained mapping versions using Informatica Versioning.
  • Generated complex reports using the BI tool Cognos ReportNet.

Environment: Informatica Power Center 7.1.1/7.1.2 (Designer, Workflow Manager), Oracle 9i/10g, Toad 8.5 (SQL, PL/SQL), Windows NT, UNIX Shell Scripts, Cognos ReportNet 1.1

Confidential, Boston, MA

Informatica Developer

Responsibilities:

  • Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
  • Involved in Dimensional Modeling Techniques to create Dimensions and Fact tables using Erwin.
  • Involved in creating, administering repositories, Folders, Permissions and users using repository manger.
  • Knowledge of upgrading from 6.x to 7.x.
  • Used ETL to extract and load data from Oracle, SQL Server, and flat files to Oracle.
  • Involved in writing lot of Functions, Stored Procedures.
  • Created various transformations such as Update Strategy, Look Up, Joiner, Filter and Router Transformations.
  • Developed mapping to load data in slowly changing dimensions.
  • Performance Tuning of Sessions and Mappings.
  • Created and Monitored Workflows using Workflow Manager and Workflow Monitor
  • Used Autosys to schedule Informatica workflows, sessions and batch jobs, and to automate the update processes.
  • Involved in Unit testing of the mappings.
  • Creating Relational Connections, Migrating the Mappings from Dev to Test and Test to Production
  • Creating Tables, Indexes, Triggers in Different Environments
  • Creating Stored procedures for loading the staging tables

Environment: Informatica Power Center 7.1.1/6.2 (Designer, Workflow Manager), SQL Server, Toad 6.5 (SQL, PL/SQL, SQL*Loader), Erwin 4.0, Windows 2000, UNIX, Shell Scripts, Autosys.

Confidential, Chicago, IL

Informatica Developer / Analyst

Responsibilities:

  • Extensively used Informatica Power center for extracting, transforming and loading databases from Oracle and non-Oracle databases.
  • Wrote SQL, PL/SQL, stored procedures, triggers and cursors for implementing business rules and transformations.
  • Extensively used ETL to load data from different source systems like Oracle, flat files, etc., into the staging table and then load the data into the target database.
  • Worked on Informatica tools Source Analyzer, Warehouse designer, Mapping Designer, Transformations, Informatica Repository Manager and Informatica Workflow Manager.
  • Used almost all the transformations like Source qualifier, Expression, aggregator, filter, lookup, stored procedure, sequence generator and update strategy to develop mapping between source system data and data warehouse model.
  • Extensively used Informatica functions like IIF, DECODE, ISNULL, ADD_TO_DATE and TO_DATE in transformations.
  • Worked with mapping variables, mapping parameters and variable functions like SetVariable, SetCountVariable, SetMinVariable and SetMaxVariable.
  • Used Informatica for migrating data from various OLTP Systems, which included identifying various source databases.
  • Developed re-usable transformations, re-usable Mapplets to use them for data load to data warehouse and database (oracle).
  • Involved in scheduling of sessions and workflows.
  • Transferred files from one server to another remote server using Power Channel.
  • Created, launched & scheduled sessions. Configured email notification.

Environment: Informatica Power Center 6.2, Oracle 9i, SQL Server, PL/SQL, Erwin 4.0, Windows 2000, UNIX, Shell Scripts.

Confidential, Chicago, IL

Informatica Developer

Responsibilities:

  • Worked with the Business analysts for requirements gathering, business analysis and testing
  • Gathered requirements with the internal users and sent requirements to our source system vendor.
  • Dimensional Data modeling after determining the business rules.
  • Imported data from various Sources (Oracle, fixed width flat files) transformed and loaded into Targets
  • Worked on Informatica Power Center 5.1 tool - Source Analyzer, warehouse designer, Mapping Designer, Server Manager, Mapplets, and Reusable Transformations.
  • Worked on Informatica Power center to map the source data into the target star schema.
  • Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, testing of Informatica Sessions, and the Target Data.
  • Extensively used Transformations like Router Transformation, Lookup Transformation (connected & unconnected), Source Qualifier Transformation, Update Strategy Transformation, Joiner Transformation, Expression Transformation, Aggregator Transformations and Sequence generator Transformations.
  • Created Pipeline partitioning to improve session performance.
  • Also implemented and maintained reports using Business Objects 5.0.

Environment: Informatica Power Center 5.1, Oracle 8i, PL/SQL, Business Objects, Erwin, Toad, Windows 2000, UNIX, Shell Scripts.

Confidential, San Jose, CA

Informatica Developer

Responsibilities:

  • Performed data analysis based on the source systems and existing OLTP database systems.
  • Interacted with business analysts, data architects, application developers to develop a data model
  • Defined attributes on various dimensions like clinical programs, protocol, subjects, site, design, activities, payments, etc., to suggest a star schema with the core entity as the patient's treatment records.
  • Analyzed data sources to determine source availability.
  • Developed a data transfer strategy from various legacy data sources like mainframes and operational systems, and validated data models defined in the ER diagram.
  • Involved in Performance Tuning of sources, targets, mappings and sessions
  • Developed UNIX shell scripts as part of the ETL process to schedule tasks/sessions.
  • Performed unit testing to validate mappings and populate the database.
  • Documented technical specifications, business requirements and functional specifications for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables.

Environment: Informatica Power Center 5.1, Oracle 9i, SQL, PL/SQL, Unix shell scripts, Windows 2000, Sun Solaris, VSAM, Business Objects 5.0, TOAD, Erwin.
