Sr. Informatica Lead Developer/ICS Developer Resume
SUMMARY:
- 11 years of experience in the development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, Tableau, OLAP, BI, and client/server applications with multiple databases and Salesforce CRM.
- Expertise in Data Warehouse/Data Mart, file feed, ODS, OLTP, and OLAP implementations, spanning project scoping, data modeling, ETL development, system testing, implementation, and production support.
- Strong knowledge of ICS and ICRT integration with Salesforce for real-time application solutions with OData configuration.
- Hands-on experience with Informatica Cloud Services (ICS) and Informatica Cloud Real Time (ICRT).
- Configured vendor web services with services and processes through ICRT, exposing them as SOAP and REST solutions.
- Experience in implementation and integration projects, including Salesforce API, REST, and SOAP API integration.
- Basic knowledge of SFDC development using Apex classes, Triggers, Visualforce, and the Force.com IDE, including writing test classes for code coverage.
- Involved in all phases of Informatica Cloud Services and Informatica Cloud Real Time integration activities with Salesforce, Oracle, and REST/SOAP web services.
- Worked with Data Synchronization tasks, Data Replication tasks, Mapping Configuration tasks, and PowerCenter tasks.
- Handled configuration activities such as setting up runtime environments, connections, schedules, mapplets, saved queries, and fixed-width file formats.
- Designed Mappings, Integration Templates, Bundles and Task flows.
- Involved in environment setup with Informatica Cloud, Salesforce, web services, Oracle, and SQL Server.
- Developed Informatica Cloud Services mappings, mapping configuration tasks, and task flows using ICS.
- Created Custom Profiles, Public Groups and Roles to distribute user rights and functionality.
- Created and customized record types, page layouts, and list views; managed role hierarchies and profiles.
- Developed Reports, Dashboards and processes to continuously monitor data quality and integrity.
- Configured Chatter for users across the company for collaboration.
- Used Data Loader for insert, update, and bulk import/export of data on salesforce.com objects.
- Worked with Lightning Components, controllers, helper methods, and style sheets.
- Ability to meet deadlines and handle pressure while coordinating multiple tasks in a work/project environment; versatile team player with excellent analytical and presentation skills.
- Strong understanding of fundamental business processes; excellent communication and interpersonal skills with the ability to work well in a dynamic team environment.
- Provided a single data access point for structured and unstructured data by using data virtualization.
- Experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER-Studio (a star-schema sketch follows this summary).
- Expertise in working with relational databases such as Oracle 11g/10g, SQL Server 2008/2005 and Teradata.
- Excellent skills in Oracle, Netezza, Teradata, SQL Server, and DB2 databases and Salesforce CRM architecture.
- Worked on Lambda architecture for real-time streaming and batch processing of weblogs.
- Hands-on experience developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server, Oracle SQL, and Oracle PL/SQL.
- Expert in building Data Integration, Data Visualization, Workflow solutions, and ETL solutions for clustered data warehouse using SQL Server Integration Services (SSIS).
- Installed and configured Informatica data virtualization on a POC server, including mirroring of the PowerCenter server.
- Built Splunk dashboards using XML and Advanced XML, and created scheduled alerts for application teams for real-time monitoring.
- Experience in big data analysis, frequent itemset mining, and association rule mining.
- Experience writing procedures and functions in PL/SQL, and troubleshooting and performance-tuning PL/SQL scripts.
- Experience creating profiles using the Informatica Data Quality Developer and Analyst tools.
- Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
- Used PowerCenter data virtualization to define, modify, and enable reuse of data integration procedures by storing and managing data models, transformations, workflows, and other artifacts.
- Performed database (ETL) testing, report testing, and functionality, E2E, and regression testing.
- Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
- Proficient in integrating various data sources with multiple relational databases like Oracle 11g/10g, MS SQL Server, DB2, Teradata, VSAM files, and flat files into the staging area, ODS, data warehouse, and data mart.
- Worked with Netezza database to implement data cleanup, performance-tuning techniques.
- Maintained data virtualization models in a CASE tool such as Erwin, PowerDesigner, or ER/Studio.
- Experience in using Automation Scheduling tools like Autosys and Control-M.
- Worked extensively with Broadridge BPS and slowly changing dimensions in an EDW environment.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
- Experience in UNIX shell scripting, FTP, Change Management process and EFT file management in various UNIX environments.
- Highly motivated to take on independent responsibility; a contributing, productive team member with excellent verbal communication skills and a clear understanding of business procedures.
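To ground the dimensional-modeling bullet above, here is a minimal star-schema sketch in Oracle SQL: one dimension, one fact table keyed by a surrogate key, and a typical slice-and-aggregate query. All table and column names are hypothetical, not taken from any project below.

```sql
-- Hypothetical star-schema sketch: one dimension and one fact table.
CREATE TABLE dim_customer (
    customer_sk    NUMBER        PRIMARY KEY,   -- surrogate key
    customer_id    VARCHAR2(20)  NOT NULL,      -- natural/business key
    customer_name  VARCHAR2(100),
    region         VARCHAR2(50)
);

CREATE TABLE fact_sales (
    sale_id        NUMBER        PRIMARY KEY,
    customer_sk    NUMBER        NOT NULL REFERENCES dim_customer (customer_sk),
    sale_date      DATE          NOT NULL,
    quantity       NUMBER(10),
    sale_amount    NUMBER(12,2)
);

-- Typical star-schema query: aggregate the fact, slice by the dimension.
SELECT d.region, SUM(f.sale_amount) AS total_sales
FROM   fact_sales f
JOIN   dim_customer d ON d.customer_sk = f.customer_sk
GROUP  BY d.region;
```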
TECHNICAL SKILLS:
Databases: Oracle 7.x/8i/9i/10g/11g (SQL, PL/SQL, Stored Procedures, Triggers), MS SQL SERVER 2000/2005/2008, DB2/UDB, Teradata, SAP Tables and MS Access.
ETL Tools: Informatica Power Center 10.1/9.6.1/9.5/8.6.1 and ICS/ICRT
Reporting tools: Business Objects, Crystal Reports
Data Modeling tools: Erwin 4.0, MS Visio, Informatica Data Virtualization.
Languages/Utilities: SQL, JDBC, PL/SQL, Python, UNIX, shell scripts, SOAP UI, Perl, Web Services, JavaScript, HTML, XML/XSD, Eclipse, C, C#.
IDE/Tools: Putty, Toad, SQL Developer, SQL Loader, HP Quality center
Operating Systems: UNIX (Sun Solaris, Linux, HP-UX, AIX), Windows NT, Windows XP, Windows 7, 8, 10.
Scheduling Tools: Tidal, AutoSys 11, UC4.
Testing Tools: QTP, WinRunner, LoadRunner, unit test, system test, Quality Center, Test Director, Clear test, ClearCase.
PROFESSIONAL EXPERIENCE:
Confidential
Sr. Informatica Lead Developer/ICS Developer
Responsibilities:
- Applied strong knowledge of ICS and ICRT integration with Salesforce for real-time application solutions with OData configuration.
- Worked hands-on with Informatica Cloud Services (ICS) and Informatica Cloud Real Time (ICRT).
- Configured vendor web services with services and processes through ICRT, exposing them as SOAP and REST solutions.
- Involved in all phases of Informatica Cloud Services and Informatica Cloud Real Time integration activities with Salesforce, Oracle, and REST/SOAP web services.
- Worked with Data Synchronization tasks, Data Replication tasks, Mapping Configuration tasks, and PowerCenter tasks.
- Interacting with business owners to gather both functional and technical requirements.
- Reviewed functional requirements received from different states with the Business Analyst and signed off on the requirement documents.
- Prepared technical design documents and unit test cases per the functional specifications.
- Developed and tested Informatica mappings based on the specification.
- Used various transformations to extract data from differently formatted files and relational source systems.
- Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions, implementing best practices to maintain optimal performance.
- Designed, developed, and tested Informatica mappings, workflows, worklets, reusable objects, SQL queries, and shell scripts to implement complex business rules.
- Developed reusable transformations and mapplets to avoid redundant coding, reducing development time and improving load performance at both the mapping and session levels.
- Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
- Provided a single environment for data integration and data migration with role-based tools sharing common metadata, using Informatica data virtualization.
- Performed and automated ETL tasks including data cleansing, data conversion, and transformations to load an Oracle 10g-based data warehouse.
- Created staging tables to reduce code changes in mappings, handling dynamic field positions in the source data files and generating flat files.
- Designed and developed strategy for the Workflows and set dependencies between the workflows.
- Worked extensively on Erwin and ER Studio in several projects in both OLAP and OLTP applications.
- Provided a single data access point for structured and unstructured data by using data virtualization.
- Enabled agile Business Intelligence (BI) with data virtualization.
- Performed user acceptance, E2E, multi-browser, regression, and smoke testing.
- Validated and debugged existing mappings, tested workflows and sessions, and identified better technical solutions; found bottlenecks in old and new mappings and tuned them for better performance.
- Debugged parameter issues; built matrix reports and charts.
- Created parameterized, drill-down, drill-through, sub-, linked, snapshot, cached, and ad hoc reports using SSRS.
- Interpreted ETL requirement specifications to develop HLDs and LLDs for SCD Type 1, Type 2, and Type 3 mappings, and was involved in testing various data/reports.
- Performed extensive performance tuning at the mapping level, such as placing active transformations like Filter as early as possible in the mapping; worked extensively with the Update Strategy transformation to implement inserts and updates.
- Created a data virtualization layer that hides and handles the complexity of accessing underlying data sources.
- Investigated software bugs and reported them to developers using the Quality Center Defect module.
- Worked on Informatica Data Quality Developer modules such as Key Generator, Parser, Standardizer, Address Validator, Match, and Consolidation.
- Wrote PL/SQL stored procedures to manipulate the data.
- Designed and developed T-SQL logic for loading slowly changing dimension tables, flagging records with the Update Strategy transformation to populate the desired target records.
- Implemented slowly changing dimension (SCD) Type 1 and Type 2 mappings for changed data capture (a SQL sketch of the Type 2 pattern follows this list).
- Helped with data profiling, specifying and validating rules (Scorecards), and monitoring data quality using the Informatica Analyst tool.
- Performed UAT for HIPAA 4010 and 5010 projects, including legacy testing and HIPAA requirements and compliance mandates.
- Worked extensively on Broadridge BPS and Customer Improvement Plan (CIP) items, adding new planning areas to the EDW goal state.
- Created test cases and assisted in UAT testing.
- Reviewed Informatica mappings and system test cases before delivering to the client.
- Developed shell/Perl and MFT scripts to transfer files using FTP/SFTP and to automate ETL jobs.
- Created UNIX shell scripts to archive and purge source files in weblogs.
- Redesigned multiple existing PowerCenter mappings to implement change requests (CRs) reflecting updated business logic.
- Migrated Informatica mappings, sessions, and workflows from the development environment to QA, checking developed code into TortoiseSVN for release and exception management.
- Maintained all phases of support documents, such as operation manuals and application flows.
- Documented data mappings/transformations per the B2B business requirements.
- Transferred knowledge to outsource team prior to my project completion.
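As referenced in the SCD bullet above, a minimal Oracle SQL sketch of the Type 2 pattern: expire the current dimension row when a tracked attribute changes, then insert a fresh current version. In the project this logic lived in Informatica mappings (Lookup/Update Strategy transformations); the tables, columns, and sequence here are hypothetical.

```sql
-- Hypothetical SCD Type 2 load: expire changed rows, then add new versions.
-- stg_customer is the incoming staging table; dim_customer is the dimension.

-- Step 1: close out current dimension rows whose tracked attributes changed.
UPDATE dim_customer d
SET    d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (
         SELECT 1
         FROM   stg_customer s
         WHERE  s.customer_id = d.customer_id
         AND    (s.customer_name <> d.customer_name OR s.region <> d.region)
       );

-- Step 2: insert a new current version for changed and brand-new keys.
INSERT INTO dim_customer
       (customer_sk, customer_id, customer_name, region,
        eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.region,
       SYSDATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (
         SELECT 1
         FROM   dim_customer d
         WHERE  d.customer_id  = s.customer_id
         AND    d.current_flag = 'Y'
       );
```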
Environment: Informatica PowerCenter 9.1.0/9.6.1, Salesforce, Oracle 11g/10g RAC, ESP, Putty, Erwin, XML files, CSV files, SQL, PL/SQL, Informatica Data Virtualization, Linux, C#, UNIX shell scripting, Netezza 7.0.4, Ab Initio Data Profiler, Windows 8/7, SSIS/SSRS, Informatica Cloud, Toad 3.0, T-SQL, Aginity, Cognos, BO BI 4.0.
Confidential, VA
Sr. Informatica Developer
Responsibilities:
- Developed ETL programs using Informatica to implement the business requirements.
- Communicated with business customers to discuss issues and requirements.
- Created shell scripts to fine tune the ETL flow of the Informatica workflows.
- Used Informatica file watch events to poll the FTP sites for the external mainframe files.
- Provided production support to resolve ongoing issues and troubleshoot problems.
- Performed performance tuning at the functional and mapping levels, using relational SQL wherever possible to minimize data transfer over the network (see the Source Qualifier override sketch after this list).
- Effectively used Informatica parameter files to define mapping variables, workflow variables, FTP connections, and relational connections.
- Developed pre-processor and post-processor Perl scripts for the Mercer Funds files to create index files for incoming zip files and to update the corresponding DB2 tables during post-ingestion processing and loading (ETL processes).
- Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
- Worked effectively in a version-controlled Informatica environment and used deployment groups to migrate objects.
- Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
- Worked effectively in an onsite/offshore model.
- Used pre- and post-session assignment variables to pass variable values from one session to another.
- Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks; used the Informatica scheduler to schedule jobs.
- Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.
- Performed unit testing at various levels of the ETL and actively involved in team code reviews.
- Identified problems in existing production data and developed one time scripts to correct them.
- Fixed invalid mappings and troubleshot technical problems in the database.
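As an illustration of the "relational SQL to minimize network transfer" point above, a Source Qualifier override of this shape pushes the join and filter into the source database so only qualifying rows reach the ETL server. The tables, columns, and filter are hypothetical.

```sql
-- Hypothetical Source Qualifier override: join and filter at the source
-- database so only qualifying rows are sent over the network.
SELECT o.order_id,
       o.order_date,
       c.customer_name,
       o.order_amount
FROM   orders o
JOIN   customers c ON c.customer_id = o.customer_id
WHERE  o.order_date >= TRUNC(SYSDATE) - 1   -- yesterday's orders only
AND    o.status = 'COMPLETE';
```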
Environment: Informatica 9.6, Salesforce, SQL Server 2008 R2, HP-UX, DB2, Oracle 11g/10g RAC, ESP, Putty, Erwin, XML files, CSV files, SQL, PL/SQL
Confidential, Rockville, NJ
Informatica Developer
Responsibilities:
- Interacting with business owners to gather both functional and technical requirements.
- Reviewed functional requirements received from different states with the Business Analyst and signed off on the requirement documents.
- Prepared technical design documents and unit test cases per the functional specifications.
- Developed and tested Informatica mappings based on the specification.
- Used various transformations to extract data from differently formatted files and relational source systems.
- Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions, implementing best practices to maintain optimal performance.
- Designed, developed, and tested Informatica mappings, workflows, worklets, reusable objects, SQL queries, and shell scripts to implement complex business rules.
- Developed reusable transformations and mapplets to avoid redundant coding, reducing development time and improving load performance at both the mapping and session levels.
- Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
- Provided a single environment for data integration and data migration with role-based tools sharing common metadata, using Informatica data virtualization.
- Used Informatica B2B Data Exchange to process structured data such as XML (an illustrative XML-shredding sketch follows this list).
- Provided a single data access point for structured and unstructured data by using data virtualization.
- Enabled agile Business Intelligence (BI) with data virtualization.
- Validated and debugged existing mappings, tested workflows and sessions, and identified better technical solutions; found bottlenecks in old and new mappings and tuned them for better performance.
- Designed Mappings using B2B Data Transformation Studio.
- Debugged parameter issues; built matrix reports and charts.
- Created parameterized, drill-down, drill-through, sub-, linked, snapshot, cached, and ad hoc reports using SSRS.
- Created a data virtualization layer that hides and handles the complexity of accessing underlying data sources.
- Investigated software bugs and reported them to developers using the Quality Center Defect module.
- Worked on Informatica Data Quality Developer modules such as Key Generator, Parser, Standardizer, Address Validator, Match, and Consolidation.
- Wrote PL/SQL stored procedures to manipulate the data.
- Designed and developed T-SQL logic for loading slowly changing dimension tables, flagging records with the Update Strategy transformation to populate the desired target records.
- Implemented slowly changing dimension (SCD) Type 1 and Type 2 mappings for changed data capture.
- Helped with data profiling, specifying and validating rules (Scorecards), and monitoring data quality using the Informatica Analyst tool.
- Performed UAT for HIPAA 4010 and 5010 projects, including legacy testing and HIPAA requirements and compliance mandates.
- Created test cases and assisted in UAT testing.
- Reviewed Informatica mappings and system test cases before delivering to the client.
- Developed shell/Perl and MFT scripts to transfer files using FTP/SFTP and to automate ETL jobs.
- Created UNIX shell scripts to archive and purge source files in weblogs.
- Documented data mappings/transformations per the B2B business requirements.
- Transferred knowledge to outsource team prior to my project completion.
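The B2B/XML work above was done with Informatica B2B Data Exchange; purely as an illustration of shredding structured XML into staging rows, here is a hypothetical Oracle XMLTABLE sketch. The document shape, table, and column names are invented.

```sql
-- Hypothetical example: shred an XML payload into relational rows
-- with Oracle's XMLTABLE, then insert into a staging table.
INSERT INTO stg_trade (trade_id, symbol, quantity, price)
SELECT x.trade_id, x.symbol, x.quantity, x.price
FROM   XMLTABLE(
         '/trades/trade'
         PASSING XMLTYPE(
           '<trades>
              <trade><id>1</id><symbol>ABC</symbol><qty>100</qty><price>9.50</price></trade>
            </trades>')
         COLUMNS
           trade_id NUMBER        PATH 'id',
           symbol   VARCHAR2(10)  PATH 'symbol',
           quantity NUMBER        PATH 'qty',
           price    NUMBER(12,4)  PATH 'price'
       ) x;
```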
Environment: Informatica PowerCenter 9.1.0/9.6.1, Oracle 11g/10g RAC, ESP, Putty, Erwin, XML files, CSV files, SQL, PL/SQL, Informatica Data Virtualization, Linux, C#, UNIX shell scripting, Netezza 7.0.4, Ab Initio Data Profiler, Windows 8/7, SSIS/SSRS, Informatica Cloud, Toad 3.0, T-SQL, Aginity, Cognos, BO BI 4.0.
Confidential, Jersey City, NJ
Informatica Developer
Responsibilities:
- Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
- Designed ETL specification documents for all the projects.
- Created tables, keys (unique and primary), and indexes in SQL Server.
- Extracted data from flat files, DB2, SQL Server, and Oracle ODI to build an Operational Data Store; applied business logic to load the data into the global data warehouse.
- Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
- Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
- Created a data virtualization layer that hides and handles the complexity of accessing underlying data sources.
- Created and maintained tables, views, indexes, and triggers, and developed complex stored procedures in Sybase T-SQL.
- Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
- Extensively used the Currently Processed File Name port to load the flat file name, and the contract number derived from the file name, into the target.
- Worked on complex Source Qualifier queries and pre- and post-SQL queries on the target.
- Worked on Workflow Manager tasks such as Session, Event Raise, Event Wait, Decision, E-mail, Command, Worklet, Assignment, and Timer, as well as workflow scheduling.
- Used PowerCenter data virtualization, which provides a single, optimized, high-performance environment for both data integration and data federation.
- Created Splunk dashboards for business and system performance monitoring.
- Experience profiling Vertica queries and using utilities such as Database Designer (DBD), admintools, and Workload Analyzer.
- Created nzload scripts to load flat files into Netezza staging tables.
- Configured related Tidal scheduler jobs and data conversions, and performed unit, load, and partner testing of Cisco systems.
- Performed extensive data quality checks using the Ab Initio Data Profiler.
- Wrote programs in SAS and R to generate reports, creating RTF and HTML listings, tables, and reports with SAS/ODS for ad hoc report generation.
- Used data virtualization to manage and track the complex flow of data across transactional systems, databases, and other data sources through visual tools.
- Used UNIX and shell scripting extensively to enhance Perl scripts and to develop, schedule, and support Control-M batch jobs for data migration, generation, and reporting; the Perl and shell scripts invoke stored procedures for data load, computation, and report generation.
- Used PowerCenter data virtualization, which leverages web services protocols including XML, WSDL, SOAP, and REST, to access data, metadata, and data integration workflows within an SOA.
- Architected, designed, and developed analytical dashboards for cost analysis and effective ITIL management using Tableau.
- Extensively used E2E workflow variables, mapping parameters and mapping variables.
- Created sessions and batches for incremental loads into staging tables and scheduled them to run daily (a sample incremental extract follows this list).
- Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
- Experience in DW concepts and technologies on the Vertica platform.
- Developed complex reusable formula reports and reports with advanced features such as conditional formatting, built-in/custom functions, and multiple groupings in Tableau.
- Implemented Informatica recommendations, methodologies and best practices.
- Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
- Involved in Unit, Integration, System, and Performance testing levels.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
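A minimal sketch of the daily incremental-load pattern mentioned above, assuming a watermark kept in a small control table (in PowerCenter the same role is often played by a mapping variable). All names are hypothetical.

```sql
-- Hypothetical incremental extract: only rows modified since the last
-- successful run, with the watermark kept in a small control table.
SELECT s.order_id,
       s.customer_id,
       s.order_amount,
       s.last_update_ts
FROM   src_orders s
WHERE  s.last_update_ts >
       (SELECT c.last_run_ts
        FROM   etl_control c
        WHERE  c.job_name = 'ORDERS_STAGE_LOAD');

-- After a successful load, advance the watermark.
UPDATE etl_control
SET    last_run_ts = SYSTIMESTAMP
WHERE  job_name = 'ORDERS_STAGE_LOAD';
```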
Environment: Informatica PowerCenter 9.6.1, Oracle 10g, SnapLogic, WinCVS, Informatica Data Virtualization, ERL, SQL Server 2008, IBM iSeries (DB2), MS Access, C#, UNIX, T-SQL, Windows XP, Subversion (SVN), Ab Initio Data Profiler, Toad, Cognos 8.4.1, SQL Developer.
Confidential
Informatica Developer
Responsibilities:
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from DB2, source flat files and RDBMS tables to target tables.
- Experience in data analysis, data integration, data migration, conceptual data modeling or metadata creation.
- Performed Business Analysis, Data Analysis and Dimensional Data Modeling.
- Created the test environment for Staging area, loading the Staging area with data from multiple sources.
- Conducted source-system data analysis to understand current state, quality and availability of existing data.
- Performed data quality checks, data validation, and data cleanup to produce accurate reports and analysis for C-suite executives to use in key decision making.
- Worked with DataStage for data extraction, transformation, and loading (ETL).
- Tested DataStage parallel jobs for data extraction, transformation, and loading in parallel processing mode.
- Involved in daily Scrum meetings (Agile methodology) and in iteration/sprint planning meetings to plan the stories to be developed and tested in the upcoming sprint, based on priority and estimated effort.
- Experience in performance tuning of Teradata SQL queries and Informatica mappings.
- Extraction of test data from tables and loading of data into SQL tables.
- Verified source-to-target data movement and validated target data by writing SQL queries (a sample reconciliation query follows this list).
- Wrote several complex SQL queries to validate Cognos reports.
- Worked with the business team to system-test the reports developed in Cognos.
- Tested whether the reports developed in Cognos met company standards.
- Used Quality Center to track and report system defects.
- Involved in testing XML files and checking whether data was parsed and loaded into staging tables.
- Expertise working in Agile (Scrum) and Waterfall environments.
- Conducted data analysis including acquisition, cleansing, transformation, modeling, visualization, documentation, and presentation of results.
- Tested several stored procedures.
- Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into various databases.
- Responsible for data mapping testing, writing complex SQL queries using WinSQL.
- Experience in creating Python and UNIX scripts for file transfer and file manipulation.
- Wrote SQL queries against staging and data warehouse tables to validate data results.
- Tested the database for field-size validation, check constraints, and stored procedures, cross-verifying field sizes defined in the application against the metadata.
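A typical source-to-target validation of the kind described above: symmetric MINUS queries that return zero rows when staging and target agree, plus a row-count sanity check. Table names are hypothetical.

```sql
-- Hypothetical reconciliation check: rows in staging that never reached
-- the target, and rows in the target with no staging counterpart.
SELECT customer_id, customer_name FROM stg_customer
MINUS
SELECT customer_id, customer_name FROM dw_customer;

SELECT customer_id, customer_name FROM dw_customer
MINUS
SELECT customer_id, customer_name FROM stg_customer;

-- Quick row-count sanity check between the two layers.
SELECT (SELECT COUNT(*) FROM stg_customer) AS src_cnt,
       (SELECT COUNT(*) FROM dw_customer)  AS tgt_cnt
FROM   dual;
```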
Environment: Oracle 11g/10g, PL/SQL Developer, SQL*Plus, Informatica PowerCenter 9.x/8.x/7.x, HP Quality Center, Toad, ESP, Oracle Reports, UNIX, Oracle Report Builder, ERwin Data Modeler.
Confidential
ETL Informatica Developer
Responsibilities:
- Coordinated with the front end design team to provide them with the necessary stored procedures and packages and the necessary insight into the data.
- Worked with SQL*Loader to load data from flat files received daily from various facilities.
- Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
- Wrote UNIX shell scripts to process files on a daily basis: renaming files, extracting dates from file names, unzipping files, and removing junk characters before loading into the base tables.
- Involved in continuous enhancements and fixing of production problems.
- Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
- Developed PL/SQL triggers and master tables for automatic creation of primary keys.
- Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.
- Created scripts to create DB2 new tables, views, queries for new enhancement in the application using TOAD.
- Gathered data from the HP BSM database and performed ETL using DQ stored procedures.
- Created indexes on tables for faster data retrieval to enhance database performance.
- Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.
- Used Sybase T-SQL with Perl extensively to build the risk data warehouse, store the risk feeds, and generate and maintain reports.
- Implemented new strategies and performed new product evaluations, deployments, and associated processes using tools such as HP Operations Orchestration, ExtraHop, and JavaScript.
- Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE.
- Extensively used hints to direct the optimizer toward an optimal query execution plan.
- Designed and developed Netezza SQL scripts per customer requirements.
- Used bulk collections for better performance and easier data retrieval, reducing context switching between the SQL and PL/SQL engines (see the sketch after this list).
- Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package.
- Helped application teams on-board Splunk alerts and reports; experienced with complex regular expressions (regex).
- Troubleshot and fixed failed EFT jobs for IBM Sterling Connect:Enterprise processes, UNIX scripts, and VMS jobs.
- Performed backup/recovery and upgrades of the Vertica database.
- Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools like Toad, PL/SQL Developer, and SQL*Plus.
- Partitioned the fact tables and materialized views to enhance the performance.
- Extensively used bulk collection in PL/SQL objects to improve performance.
- Created records, tables, and collections (nested tables and varrays) to improve query performance by reducing context switching.
- Provided administration, setting up new FTP/SFTP/SSH accounts for clients.
- Used PRAGMA AUTONOMOUS_TRANSACTION to avoid the mutating-table problem in database triggers.
- Extensively used the advanced features of PL/SQL like Records, Tables, Object types and Dynamic SQL.
- Handled errors extensively using exception handling, for ease of debugging and to display error messages in the application.
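A compact PL/SQL sketch combining several techniques from this list: BULK COLLECT with a LIMIT to reduce SQL/PL-SQL context switching, UTL_FILE for flat-file extraction, and explicit exception handling. The directory object EXTRACT_DIR, table, and columns are assumptions, not code from the engagement.

```sql
-- Hypothetical PL/SQL extract: BULK COLLECT with a LIMIT clause to cut
-- context switching, UTL_FILE output, and explicit error handling.
CREATE OR REPLACE PROCEDURE extract_customers IS
  CURSOR c_cust IS
    SELECT customer_id, customer_name
    FROM   customers;

  TYPE t_cust_tab IS TABLE OF c_cust%ROWTYPE;
  l_rows t_cust_tab;
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  -- EXTRACT_DIR is an assumed Oracle directory object.
  l_file := UTL_FILE.FOPEN('EXTRACT_DIR', 'customers.dat', 'w');

  OPEN c_cust;
  LOOP
    FETCH c_cust BULK COLLECT INTO l_rows LIMIT 1000;  -- batch fetch
    EXIT WHEN l_rows.COUNT = 0;
    FOR i IN 1 .. l_rows.COUNT LOOP
      UTL_FILE.PUT_LINE(l_file,
        l_rows(i).customer_id || '|' || l_rows(i).customer_name);
    END LOOP;
  END LOOP;
  CLOSE c_cust;

  UTL_FILE.FCLOSE(l_file);
EXCEPTION
  WHEN UTL_FILE.INVALID_PATH THEN
    DBMS_OUTPUT.PUT_LINE('Bad directory or file name.');
    RAISE;
  WHEN OTHERS THEN
    IF UTL_FILE.IS_OPEN(l_file) THEN
      UTL_FILE.FCLOSE(l_file);
    END IF;
    RAISE;
END extract_customers;
/
```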
Environment: Oracle 11g, SQL*Plus, T-SQL, Toad, SQL*Loader, SQL Developer, ESP, shell scripts, UNIX, Windows 8/7.