Informatica Developer Resume

Minneapolis, MN

SUMMARY

  • Eight-plus (8+) years of experience in the development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, Tableau, OLAP, BI, and client/server applications with Teradata.
  • Extensive ETL and data integration experience developing ETL mappings and scripts.
  • Over 4 years of programming experience as an Oracle PL/SQL Developer in the analysis, design, and implementation of business applications using the Oracle Relational Database Management System (RDBMS).
  • Hands-on data warehousing ETL experience in clustered environments using Informatica PowerCenter 9.6.1/9.5/8.6.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Expertise in data warehouse/data mart, file feed, ODS, OLTP, and OLAP implementations, covering project scoping, data modeling, ETL development, system testing, implementation, and production support.
  • Used transformations such as Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Router, and Sequence Generator.
  • Designed ETL processes using Informatica to load data from sources to targets through data transformations.
  • Knowledge of data cleansing in Data Quality using the Match, Address Cleanse, and Data Cleanse transforms in BODS.
  • Experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.
  • Expertise in working with relational databases such as Oracle 11g/10g, SQL Server 2008/2005, and Teradata.
  • Excellent skills in Oracle, Netezza, Teradata, SQL Server, and DB2 database architecture.
  • Worked on Lambda architecture for real-time streaming and batch processing of weblogs.
  • Hands-on experience developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server, Oracle SQL, and Oracle PL/SQL.
  • Expert in building data integration, data visualization, workflow, and ETL solutions for a clustered data warehouse using SQL Server Integration Services (SSIS).
  • Experience as a Business Intelligence Developer using the Microsoft BI framework (SQL Server, SAS and R, SSIS, SSAS, SSRS) in business domains including Finance, Insurance, and Information Technology.
  • Experience writing expressions in SSRS and expert in fine-tuning reports; created many drill-through and drill-down reports using SSRS.
  • Experience using Visio and ERwin design tools to create Star and Snowflake schemas, including deployment processes and ad-hoc designs.
  • Built Splunk dashboards using XML and Advanced XML, and created scheduled alerts for application teams for real-time monitoring.
  • Experience in big data analysis, frequent itemset mining, and association rule mining.
  • Experience writing procedures and functions in PL/SQL, and troubleshooting and performance tuning PL/SQL scripts.
  • Experience in Python and UNIX shell scripting for processing large volumes of data from varied sources and loading into Vertica.
  • Experience creating profiles using the Informatica Data Quality Developer and Analyst tools.
  • Partitioned large tables using the range-partitioning technique.
  • Experience with Oracle-supplied packages such as DBMS_SQL, DBMS_JOB, and UTL_FILE.
  • Experience in resolving ongoing maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Knowledge of Informatica Data Quality, Informatica Analyst, the Informatica MDM tool, Informatica Developer Big Data Edition, Informatica B2B Data Transformation, etc.
  • Database (ETL) testing, report testing, functionality, end-to-end (E2E), and regression testing.
  • Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
  • Proficient in integrating various data sources with multiple relational databases such as Oracle 11g/10g, MS SQL Server, DB2, Teradata, VSAM files, and flat files into the staging area, ODS, data warehouse, and data mart.
  • Worked with a Netezza pharmaceutical database to implement data cleanup and performance-tuning techniques.
  • Experience in using Automation Scheduling tools like Autosys and Control-M.
  • Worked extensively with slowly changing dimensions in an EDW environment (see the SCD sketch after this list).
  • Experience in Informatica B2B Data Exchange using unstructured and structured data sets.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
  • Experience in UNIX shell scripting, FTP, Change Management process and EFT file management in various UNIX environments.
  • Highly motivated to take on independent responsibility as well as to contribute as a productive team member, with excellent verbal and written communication skills and a clear understanding of business procedures.
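
The slowly-changing-dimension work noted above typically follows the SCD Type 2 expire-and-insert pattern. Below is a minimal plain-SQL sketch of that pattern; the customer_stg/customer_dim tables, their columns, and the sequence are hypothetical names invented for illustration.

    -- Step 1: close out current dimension rows whose tracked attributes changed.
    UPDATE customer_dim d
       SET d.effective_end_date = TRUNC(SYSDATE) - 1,
           d.current_flag       = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Step 2: insert a new current version for new and changed customers.
    INSERT INTO customer_dim
          (customer_key, customer_id, address, status,
           effective_start_date, effective_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.status,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');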

TECHNICAL SKILLS

Databases: Oracle 7.x/8i/9i/10g/11g (SQL, PL/SQL, stored procedures, triggers), MS SQL Server 2000/2005/2008, DB2/UDB, Teradata, SAP tables, and MS Access.

ETL Tools: Informatica PowerCenter 9.6.1/9.5/8.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Informatica Server), SSIS (Integration Services), PowerExchange CDC, Ab Initio 1.8.

Data Modeling tools: ERwin 4.0, MS Visio.

Languages/Utilities: SQL, JDBC, PL/SQL, Python, UNIX shell scripts, SOAP UI, Perl, Web Services, JavaScript, HTML, XML/XSD, Eclipse, C.

IDE/Tools: PuTTY, TOAD, SQL Developer, SQL*Loader, HP Quality Center.

Operating Systems: UNIX (Sun Solaris, LINUX, HP UNIX, AIX), Windows NT, Windows XP, Windows 7, 8, 10.

Scheduling Tools: Tidal, AutoSys 11, UC4.

Testing Tools: QTP, WinRunner, LoadRunner, unit testing, system testing, Quality Center, TestDirector, Clear Test, ClearCase.

PROFESSIONAL EXPERIENCE

Confidential - Minneapolis, MN

Informatica Developer

Responsibilities:

  • Worked extensively with complex mappings using different transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, Aggregator, Stored Procedure, and Normalizer transformations.
  • Created ETL mappings using Informatica PowerCenter to extract data from multiple sources such as flat files, Oracle, and XML (direct and indirect methods), transformed it based on business requirements, loaded it to the data warehouse, and generated XML.
  • Worked on creating database objects such as stored procedures, views, and tables.
  • Used Oracle sources and generated XML using the XML Generator transformation (see the SQL/XML sketch after this list).
  • Perform technical reviews of code and test plans created by team members.
  • Optimized the performance of the mappings through various tests on sources, targets, and transformations; identified and removed bottlenecks and implemented performance-tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
  • Provide a single environment for data integration and data migration, with role-based tools that share common metadata, using Informatica data virtualization.
  • Promote ETL mappings from Dev to QA and support the user testing process.
  • Monitor scheduled QA and production ETL jobs in CA Workload Automation.
  • Work with support team to define methods for and potentially implement solutions for performance measuring and monitoring of all data movement technologies.
  • Regularly interact with Business Intelligence leadership on project work status, priority setting and resource allocations.
  • Provide data integration support by integrating data from different systems using the Lavastorm Business Rule Editor (Lavastorm BRE) and loading the data into DWH tables.
  • Implemented slowly changing dimension (SCD) Type 1 and Type 2 mappings for changed data capture.
  • Perform data integration, data quality management, and data analysis activities using the Lavastorm BRE analytical tool.
  • Helped with data profiling, specifying and validating rules (Scorecards), and monitoring data quality using the Informatica Analyst tool.
  • Customized UNIX scripts as required for preprocessing steps and validated input and output data elements along with DataStage routines.
  • Created test cases and assisted in UAT testing.
  • Reviewed Informatica mappings and system test cases before delivery to the client.
  • Developed reusable transformations and mapplets to offset redundant coding, reducing development time and improving loading performance at both the mapping and session levels.
  • Participate in deployment, MVC, system testing, and UAT.
  • Created a POC for the migration of Lavastorm BRE graphs to Informatica PowerCenter 9.6.1.
  • Establish and ensure appropriate data quality and ETL metrics are defined, monitored and managed.
  • Set strategy and oversee design and development of EDW staging areas and target tables.
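
The XML generation above was done with PowerCenter's XML Generator transformation; as a rough analogue, the sketch below shows how similar output could be produced directly from an Oracle source with SQL/XML functions. The orders table, its columns, and the element names are assumptions for illustration.

    SELECT XMLSERIALIZE(
             DOCUMENT
             XMLELEMENT("Order",
               XMLATTRIBUTES(o.order_id AS "id"),
               XMLELEMENT("Customer", o.customer_name),
               XMLELEMENT("Total",    o.order_total))
             AS CLOB INDENT) AS order_xml
      FROM orders o
     WHERE o.order_date >= TRUNC(SYSDATE) - 7;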

Environment: Informatica PowerCenter 9.6.1, CA Workload Automation, RDBMS (Oracle, SQL Server 2016), flat files, XML, Lotus Notes, ServiceNow, DBCM, Windows 8/7, XML files, CSV files.

Confidential, New Orleans, LA

ETL (Informatica) Developer

Responsibilities:

  • Interacting with business owners to gather both functional and technical requirements.
  • Understanding and reviewing the functional requirements received from different states with the Business Analyst, and signing off on the requirements document.
  • Prepared technical design document as per the functional specification and unit test cases.
  • Developed and tested Informatica mappings based on the specification.
  • Used various transformations to extract data from differently formatted files and relational source systems.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions; implemented best practices to maintain optimal performance.
  • Designed, developed, and tested Informatica mappings, workflows, worklets, reusable objects, SQL queries, and shell scripts to implement complex business rules.
  • Developed reusable transformations and mapplets to offset redundant coding, reducing development time and improving loading performance at both the mapping and session levels.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Broad experience using Informatica Data Quality (IDQ) for initial data profiling, matching, and removing duplicate data.
  • Used Informatica B2B Data Exchange for structured data such as XML.
  • Performed/automated many ETL-related tasks, including data cleansing, conversion, and transformations, to load an Oracle 10g-based data warehouse.
  • Analyzed the data based on requirements, wrote techno-functional documents, and developed complex mappings using Informatica Data Quality (IDQ).
  • Created tables in staging to reduce code changes in mappings, handle dynamic field positions in the source data files, and generate flat files.
  • Designed and developed the strategy for the workflows and set dependencies between the workflows.
  • Extensive work in SSRS, SSAS, SSIS, MS SQL Server, SQL programming, and MS Access.
  • Worked extensively with ERwin and ER/Studio on several projects in both OLAP and OLTP applications.
  • User acceptance, E2E, multiple-browser, regression, and smoke testing.
  • Validated and debugged old mappings, tested workflows and sessions, and identified better technical solutions; identified bottlenecks in old and new mappings and tuned them for better performance.
  • Designed Mappings using B2B Data Transformation Studio.
  • Debugged parameter issues and worked with parameterized matrix reports and charts.
  • Expert in creating parameterized reports, drill-down, drill-through, subreports, linked reports, snapshot, cached, and ad-hoc reports using SSRS.
  • Understood ETL requirement specifications to develop HLDs and LLDs for SCD Type I, Type II, and Type III mappings, and was involved in testing various data and reports.
  • Worked extensively on performance tuning at the mapping level, for example placing active transformations such as Filter as early as possible in the mapping; worked extensively with the Update Strategy transformation to implement inserts and updates.
  • Investigated software bugs and reported them to the developers using the Quality Center Defect Module.
  • Worked on Informatica Data Quality Developer modules such as Key Generator, Parser, Standardizer, Address Validator, Match, and Consolidation.
  • Wrote PL/SQL stored procedures to manipulate the data (see the sketch after this list).
  • Designed and developed the logic for handling slowly changing dimension table loads by flagging records using the Update Strategy transformation to populate the desired targets.
  • Implemented slowly changing dimension (SCD) Type 1 and Type 2 mappings for changed data capture.
  • Helped with data profiling, specifying and validating rules (Scorecards), and monitoring data quality using the Informatica Analyst tool.
  • UAT testing for HIPAA 4010 and 5010 projects including legacy testing and HIPAA requirements and compliance mandates.
  • Extensively worked on Customer Improvement Plan (CIP) items by adding new planning areas to the EDW Goal State.
  • Created test cases and assisted in UAT testing.
  • Reviewed Informatica mappings and system test cases before delivery to the client.
  • Developed shell/Perl MFT scripts to transfer files using FTP and SFTP and to automate ETL jobs.
  • Created UNIX shell scripts to archive and purge source files in weblogs.
  • Re-designed multiple existing Power Center mappings to implement change requests (CR) representing the updated business logic.
  • Migrated Informatica mappings, sessions, and workflows from the development environment to QA, checking the developed code into Tortoise SVN for release and exception management.
  • Maintained all phases of support documents like operation manual, application flows.
  • Documented data mappings/transformations per the B2B business requirements.
  • Transferred knowledge to outsource team prior to my project completion.
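
A minimal sketch of the kind of data-manipulation stored procedure described above, assuming hypothetical members/members_stg tables; it upserts one member's staged attributes with a MERGE.

    CREATE OR REPLACE PROCEDURE upsert_member (
        p_member_id IN members_stg.member_id%TYPE
    ) AS
    BEGIN
        MERGE INTO members m
        USING (SELECT member_id, member_name, state_code
                 FROM members_stg
                WHERE member_id = p_member_id) s
           ON (m.member_id = s.member_id)
         WHEN MATCHED THEN UPDATE
              SET m.member_name = s.member_name,
                  m.state_code  = s.state_code
         WHEN NOT MATCHED THEN INSERT (member_id, member_name, state_code)
              VALUES (s.member_id, s.member_name, s.state_code);
        COMMIT;  -- commit per member; batch callers may prefer to commit once
    END upsert_member;
    /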

Environment: Informatica PowerCenter 9.1.0/9.6.1, Oracle 11g/10g RAC, ESP, PuTTY, ERwin, XML files, CSV files, SQL, PL/SQL, Linux, UNIX shell scripting, Netezza, Ab Initio Data Profiler, Windows 7, SSIS/SSRS, Informatica Cloud, TOAD 3.0, Aginity, Cognos, BO BI 4.0.

Confidential, Carrollton, TX

ETL Informatica Developer

Responsibilities:

  • Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
  • Designed ETL specification documents for all the projects.
  • Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.
  • Extracted data from flat files, DB2, SQL Server, and Oracle to build an operational data store; applied business logic to load the data into the global data warehouse.
  • Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
  • Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
  • Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
  • Extensively used the Add Currently Processed Flat File Name port to load the flat file name, and to load the contract number derived from the flat file name into the target.
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created POC data quality mappings in the IDQ tool and imported them into PowerCenter as mappings and mapplets.
  • Worked on complex Source Qualifier queries and pre- and post-SQL queries in the target.
  • Worked on different tasks in Workflow Manager, such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment, and Timer tasks, and scheduling of workflows.
  • Created Splunk dashboards for business and system performance monitoring.
  • Experience profiling Vertica queries and using utilities such as the Database Designer (DBD), admin tools, and Workload Analyzer.
  • Created nzload scripts to load the flat files into Netezza staging tables.
  • Configured related Tidal scheduler jobs and performed unit, load, and partner testing of Cisco systems.
  • Performed extensive data quality checks using the Ab Initio Data Profiler tool.
  • Wrote programs in SAS and R to generate reports, creating RTF and HTML listings, tables, and reports using SAS/ODS for ad-hoc report generation.
  • Used UNIX and shell scripting extensively to enhance Perl scripts and to develop, schedule, and support Control-M batch jobs that schedule data generation and reporting; the Perl and shell scripts invoke stored procedures for data loads, computation, and report generation.
  • Architected, designed, and developed analytical dashboards for cost analysis and effective ITIL management using Tableau.
  • Extensively used E2E workflow variables, mapping parameters, and mapping variables.
  • Created sessions and batches for incremental loads into staging tables and scheduled them to run daily (see the incremental-extract sketch after this list).
  • Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Experience in DW concepts and technologies using the Vertica platform.
  • Developed complex reusable formula reports and reports with advanced features such as conditional formatting, built-in/custom functions, and multiple-grouping reports in Tableau.
  • Implemented Informatica recommendations, methodologies and best practices.
  • Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
  • Involved in Unit, Integration, System, and Performance testing levels.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
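
A minimal sketch of the incremental-load pattern behind the daily staging sessions above, assuming a hypothetical watermark table etl_load_control; PowerCenter mapping variables can play the same role, but the idea is the same.

    -- Extract only rows changed since the last successful load.
    SELECT t.*
      FROM source_transactions t
     WHERE t.last_updated > (SELECT c.last_load_ts
                               FROM etl_load_control c
                              WHERE c.job_name = 'STG_TRANSACTIONS');

    -- After a successful load, advance the watermark.
    UPDATE etl_load_control
       SET last_load_ts = SYSTIMESTAMP
     WHERE job_name = 'STG_TRANSACTIONS';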

Environment: Informatica PowerCenter 8.6.1, Oracle 10g, ERL, Alteryx, Axon, SQL Server 2008, IBM iSeries (DB2), MS Access, UNIX, Windows XP, NoSQL, Subversion (SVN), Ab Initio Data Profiler, TOAD, Cognos 8.4.1, SQL Developer.

Confidential, Detroit, Michigan

ETL Tester

Responsibilities:

  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from DB2, source flat files and RDBMS tables to target tables.
  • Experience in data analysis, data integration, conceptual data modeling or metadata creation.
  • Performed Business Analysis, Data Analysis and Dimensional Data Modeling.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Conducted source-system data analysis to understand current state, quality and availability of existing data.
  • Data quality checking, data validation, and data cleanup to produce accurate reports and analyses for C-suite executives to use in key decision making.
  • Worked with DataStage for data extraction, transformation, and loading (ETL).
  • Tested all the DataStage parallel jobs for extraction, transformation, and loading of data using DataStage in parallel processing mode.
  • Involved in daily Scrum meetings (Agile methodology); also involved in iteration/sprint planning meetings to plan the stories that need to be developed and tested in the upcoming sprint based on priority and estimated effort.
  • Experience in performance tuning of Teradata SQL queries and Informatica mappings.
  • Extracted test data from tables and loaded data into SQL tables.
  • Verified source-to-target data movement and validated the target data by writing SQL queries (see the validation sketch after this list).
  • Wrote several complex SQL queries for validating Cognos reports.
  • Worked with business team to system test the reports developed in Cognos.
  • Tested whether the reports developed in Cognos are as per company standards.
  • Used Quality Center to track and report system defects.
  • Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
  • Expertise working in Agile (Scrum) and Waterfall environments.
  • Conducted data analysis including acquisition, cleansing, transformation, modeling, visualization, documentation, and presentation of results.
  • Tested several stored procedures.
  • Tested the ETL process both before and after the data validation process; tested the messages published by the ETL tool and the data loaded into various databases.
  • Responsible for data mapping testing by writing complex SQL queries using WinSQL.
  • Experience in creating Python and UNIX scripts for file transfer and file manipulation.
  • Wrote SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Tested the database to check field-size validation, check constraints, and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
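
Typical source-to-target validation queries of the kind described above; the stg_customers/dw_customers tables are hypothetical stand-ins for a source extract and its target.

    -- Rows present in the source extract but missing or different in the target.
    SELECT customer_id, customer_name, balance
      FROM stg_customers
    MINUS
    SELECT customer_id, customer_name, balance
      FROM dw_customers;

    -- Quick row-count reconciliation between source and target.
    SELECT (SELECT COUNT(*) FROM stg_customers) AS src_rows,
           (SELECT COUNT(*) FROM dw_customers)  AS tgt_rows
      FROM dual;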

Environment: Oracle 11g/10g, PL/SQL Developer, SQL*Plus, Informatica Data Quality (IDQ), Informatica PowerCenter 9.x/8.x/7.x, HP Quality Center, NoSQL, TOAD, ESP, Oracle Reports, UNIX, Oracle Report Builder, ERwin Data Modeler.

Confidential, Dallas, TX

ETL and PLSQL Developer

Responsibilities:

  • Coordinated with the front end design team to provide them with the necessary stored procedures and packages and the necessary insight into the data.
  • Worked on SQL*Loader to load data from flat files obtained from various facilities every day.
  • Created and modified several UNIX shell scripts according to the changing needs of the project and client requirements.
  • Wrote UNIX shell scripts to process files on a daily basis: renaming files, extracting dates from file names, unzipping files, and removing junk characters from files before loading them into the base tables.
  • Involved in the continuous enhancements and fixing of production problems.
  • Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Created PL/SQL stored procedures, functions, and packages for moving data from the staging area to the data mart.
  • Created scripts to create new DB2 tables, views, and queries for new enhancements in the application using TOAD.
  • Gathered data from the HP BSM database and performed ETL using IDQ stored procedures.
  • Created indexes on the tables for faster retrieval of data to enhance database performance.
  • Involved in data loading using PL/SQL and SQL*Loader, calling UNIX scripts to download and manipulate files.
  • Used Sybase T-SQL with Perl extensively to build the risk data warehouse, store the risk feeds, and generate and maintain reports.
  • Implemented new strategies and new product evaluations, deployments, and associated processes using tools such as HP Operations Orchestration, ExtraHop, and JavaScript.
  • Performed SQL, PL/SQL, and application tuning using tools such as EXPLAIN PLAN, SQL*TRACE, TKPROF, and AUTOTRACE.
  • Extensively used hints to direct the optimizer to choose an optimum query execution plan.
  • Designed and developed Netezza SQL scripts per customer requirements.
  • Used bulk collections for better performance and easy retrieval of data by reducing context switching between the SQL and PL/SQL engines (see the BULK COLLECT sketch after this list).
  • Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package.
  • Helped application teams onboard Splunk alerts, reports, etc.; experienced in using and understanding complex regular expressions (regex).
  • Troubleshot and fixed failed EFT jobs for IBM Sterling Connect:Enterprise processes, UNIX scripts, and VMS jobs.
  • Performed backup/recovery of the Vertica DB and performed upgrades.
  • Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools like TOAD, PL/SQL Developer, and SQL*Plus.
  • Partitioned the fact tables and materialized views to enhance performance.
  • Extensively used bulk collection in PL/SQL objects to improve performance.
  • Created records, tables, and collections (nested tables and arrays) to improve query performance by reducing context switching.
  • Provided administration to set up new FTP/SFTP/SSH accounts for clients.
  • Used PRAGMA AUTONOMOUS_TRANSACTION to avoid the mutating-table problem in database triggers (see the error-logging sketch after this list).
  • Extensively used advanced features of PL/SQL such as records, tables, object types, and dynamic SQL.
  • Handled errors extensively using exception handling, for ease of debugging and for displaying error messages in the application.
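
A minimal sketch of the BULK COLLECT/FORALL pattern referenced above, assuming hypothetical accounts_stg/accounts_mart tables; batching with LIMIT bounds memory use while FORALL keeps context switches to one per batch.

    DECLARE
        CURSOR c_stg IS
            SELECT account_id, balance FROM accounts_stg;
        TYPE t_stg_tab IS TABLE OF c_stg%ROWTYPE;
        l_rows t_stg_tab;
    BEGIN
        OPEN c_stg;
        LOOP
            -- Fetch 1,000 rows per round trip to limit PGA usage.
            FETCH c_stg BULK COLLECT INTO l_rows LIMIT 1000;
            EXIT WHEN l_rows.COUNT = 0;
            -- One context switch inserts the whole batch; the record
            -- layout matches the target table's column list.
            FORALL i IN 1 .. l_rows.COUNT
                INSERT INTO accounts_mart VALUES l_rows(i);
            COMMIT;
        END LOOP;
        CLOSE c_stg;
    END;
    /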
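
And a minimal sketch of the PRAGMA AUTONOMOUS_TRANSACTION error-logging idiom that pairs with the exception handling above; the etl_error_log table and log_error procedure are hypothetical.

    CREATE OR REPLACE PROCEDURE log_error (p_msg IN VARCHAR2) AS
        PRAGMA AUTONOMOUS_TRANSACTION;
    BEGIN
        INSERT INTO etl_error_log (logged_at, message)
        VALUES (SYSTIMESTAMP, p_msg);
        COMMIT;  -- commits only the log row, not the caller's work
    END log_error;
    /

    -- Typical use inside an exception handler:
    -- EXCEPTION
    --     WHEN OTHERS THEN
    --         log_error(SQLERRM);
    --         RAISE;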

Environment: Oracle 11g, SQL*Plus, T-SQL, TOAD, SQL*Loader, SQL Developer, Informatica Data Quality (IDQ), ESP, shell scripts, UNIX, Windows XP.

Confidential

Oracle/PlSql Developer

Responsibilities:

  • Developed advanced PL/SQL packages, procedures, triggers, functions, indexes, and collections to implement business logic using SQL Navigator. Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
  • Created management analysis reports using IDQ parallel queries, Java stored procedures, the HTP package, and WEB.SHOW_DOCUMENT. Participated in change and code reviews to understand the testing needs of the changed components. Worked on troubleshooting defects in a timely manner; wrote procedures and functions in PL/SQL and performed troubleshooting and performance tuning of PL/SQL scripts.
  • Involved in creating UNIX shell scripts. Defragmented tables and used partitioning, compression, and indexes for improved performance and efficiency. Involved in table redesign, implementing partitioned tables and partitioned indexes to make the database faster and easier to maintain (see the partitioning sketch after this list).
  • Experience in database application development, query optimization, performance tuning, and DBA solutions, with implementation experience across the complete system development life cycle.
  • Used the SQL Server SSIS tool with IDQ to build high-performance data integration solutions, including extraction, transformation, and load packages for data warehousing. Extracted data from XML files and loaded it into the database.
  • Implemented server and services designs using UNIX, Linux, Windows, Sun, VMware ESX, Java, and Oracle standard infrastructure, namely Oracle Application Server, WebLogic, OID, and OAM; followed the exception process when alternative infrastructure was required. Provided Oracle, MS SQL Server, Netezza, and Big Data Hadoop solutions to customers.
  • Designed and developed Oracle Forms & Reports, generating up to 60 reports.
  • Performed modifications on existing forms per change requests and maintained them.
  • Used Crystal Reports to track logins, mouse-overs, click-throughs, session durations, and demographic comparisons against a SQL database of customer information.
  • Worked on SQL*Loader to load data from flat files obtained from various facilities every day. Used standard packages such as UTL_FILE and DBMS_SQL along with PL/SQL collections, and used bulk binding; involved in writing database procedures, functions, and packages for the front-end module.
  • Used principles of normalization to improve performance. Involved in ETL code using PL/SQL to meet requirements for extraction, transformation, cleansing, and loading of data from source to target data structures.
  • Involved in the continuous enhancements and fixing of production problems.
  • Designed, implemented, and tuned interfaces and batch jobs using PL/SQL. Involved in data replication and high-availability design scenarios with Oracle Streams. Developed UNIX shell scripts to automate repetitive database processes.
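
A minimal sketch of the range partitioning mentioned above; the sales_fact table, its partitions, and the local index are hypothetical examples.

    CREATE TABLE sales_fact (
        sale_id     NUMBER        NOT NULL,
        sale_date   DATE          NOT NULL,
        amount      NUMBER(12,2)
    )
    PARTITION BY RANGE (sale_date) (
        PARTITION p2010 VALUES LESS THAN (DATE '2011-01-01'),
        PARTITION p2011 VALUES LESS THAN (DATE '2012-01-01'),
        PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );

    -- A local index is partitioned the same way as the table.
    CREATE INDEX sales_fact_dt_ix ON sales_fact (sale_date) LOCAL;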

Environment: VB 6, Java, Oracle 10g/11g, PL/SQL, SQL*Loader, Oracle Streams 10g (replication), SQL*Plus, HTML, SQL Server SSIS, TOAD, XML, HP-UX shell scripting.

Confidential

Jr. PL/SQL DEVELOPER

Responsibilities:

  • Responsible for modifications, enhancements, and maintenance of the existing PL/SQL packages, procedures, and functions to establish standards, and for deployment to the production environment; used the UNIX environment for performing testing.
  • Developed application and database triggers, functions, procedures, and packages in PL/SQL, with Oracle 8i as the back end.
  • Involved in creating Oracle tablespaces, tables, indexes, constraints, triggers, synonyms, database links, roles, etc. (see the trigger sketch after this list).
  • Registered various applications, reports, PL/SQL packages, Request Sets, Forms, Menus and Tables.
  • Experience in logical and physical database design, data modeling, conceptual design, and data architecture, with extensive use of ERwin and MS Visio as modeling tools.
  • Understanding of Star Schema and Snowflake Schema.
  • Customized reports and forms using Developer 2000.
  • Prepared the necessary documentation and design specifications for the project.
  • Extensively worked on PL/SQL code, creating various versions for different databases and new code for new clients per the requirements.
  • Performed extensive system testing after migration to ensure the accuracy and timely processing of data.
  • Extensively created sub-queries and joins in stored procedures and functions.
  • Worked on table partitioning and index creation to improve the performance.
  • Worked with Oracle development team to fulfill their design requirements.
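
A minimal sketch of the sequence-plus-trigger idiom commonly used for automatic primary key creation, as referenced above; the orders table, sequence, and trigger names are hypothetical.

    CREATE SEQUENCE orders_seq START WITH 1 INCREMENT BY 1;

    CREATE OR REPLACE TRIGGER orders_pk_trg
    BEFORE INSERT ON orders
    FOR EACH ROW
    WHEN (NEW.order_id IS NULL)
    BEGIN
        -- Assign the next sequence value as the primary key.
        SELECT orders_seq.NEXTVAL INTO :NEW.order_id FROM dual;
    END;
    /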

Environment: Oracle 9i/10g, SQL, PL/SQL, TOAD, Developer 2000 (Forms 10g, Reports 10g), XML, HTML, UNIX, Windows XP/2000/NT, SQL*Loader.
