- Over 10 years of experience as a database developer in the analysis, design, development, and testing of business applications across OLTP, OLAP, enterprise, web-based, client/server, and SOA architectures.
- Extensive experience in Oracle PL/SQL, PostgreSQL PL/pgSQL, and Microsoft T-SQL.
- Good domain knowledge in the Energy, Retail, Manufacturing, Engineering, and Insurance sectors.
- Experience with AWS cloud services such as EC2, S3, EBS, VPC, ELB, Route 53, CloudWatch, Security Groups, IAM, and RDS.
- Worked on AWS IAM and Security Groups for public and private subnets in a VPC.
- Involved in logical/physical data modeling using ERwin.
- Experience in source systems analysis and data extraction from various platforms.
- Extensive experience developing external tables and using Oracle utilities such as SQL*Loader and the Import/Export tools on UNIX platforms.
- Experience developing and managing UNIX shell scripts, scheduling jobs with Autosys and Control-M, and monitoring batch processing.
- Strong understanding of data warehousing (including star and snowflake schemas) and Extract, Transform, Load (ETL).
- Highly proficient in developing and understanding complex stored procedures and functions in PL/SQL, PL/pgSQL, and T-SQL.
- Expertise in building schemas; managing database objects (tables, views, materialized views, indexes, triggers, scheduler jobs); partitioning tables; using optimizer hints; and SQL tuning, troubleshooting, debugging, and fixing code/scripts.
- Implemented various error/exception-handling methods using system-defined and user-defined exceptions.
- Experience working on AWS RDS/Aurora for Oracle/PostgreSQL databases.
- Proficient in Python for extracting and loading data from SQL source databases.
- Knowledge of core Java concepts such as multithreading, exception handling, and collections.
- Proficient in technologies such as XML, HTML, CSS, and JSON.
- Implemented projects using SDLC methodologies such as Agile and Waterfall.
- Excellent experience in documenting technical and functional design specifications.
- Experience in Unit testing and Integration Testing with excellent debugging skills.
- Proficient in UML methodologies, preparing use case, sequence, class, and interaction diagrams.
Databases: PostgreSQL 11.5/10.12/9.5, Oracle 12c/11g/10g/9i, SQL Server 2014/2012/2005/2004
Database Tools: pgAdmin, DBeaver, TOAD, SQL Developer, SQL*Plus, SQL*Loader
Software: MS Visio, MS Office, Denodo, Eclipse, JIRA, WinSCP, SuperPuTTY, AWS Schema Conversion Tool
Scripting: HTML, CSS, XML, UNIX shell scripting, DTD
Operating Systems: Windows 95/98/NT/XP/7/10, Windows Server 2003/2008, UNIX, Linux
Programming Languages: SQL, PL/SQL, PL/pgSQL, T-SQL, Java, Python 3.6/2.7, SQLAlchemy
Version Control Tools: TFS (2013), CVS, ClearQuest, ClearCase, Subversion (SVN), Perforce, Git
Cloud Platforms: Amazon Web Services (AWS), Oracle Cloud Infrastructure (OCI)
Senior Postgres Architect/Developer
- Created PostgreSQL tables, views, procedures, functions, and triggers based on the requirements.
- Manually converted numerous Oracle objects to PostgreSQL objects.
- Developed, debugged and optimized PostgreSQL queries for reporting and app development.
- Worked with multiple join types (cross join, left outer join, right outer join, inner join) across two or more tables.
- Created, inserted, and extracted JSON data in columns of PostgreSQL databases.
- Built servers on AWS: imported volumes, launched EC2 and RDS instances, and created security groups, Auto Scaling groups, and load balancers (ELBs) within the defined Virtual Private Cloud.
- Troubleshot performance issues on PostgreSQL instances using Performance Insights and Query Plan Management (QPM).
- Created PL/pgSQL functions to return tables, sets of refcursors, and text to the Java application.
- Created row-level security policies on tables to enforce data security for SELECT statements and data-modification commands such as INSERT and DELETE.
- Created Users, User Attributes, Privileges, Groups, Roles, Schemas and worked on setting search paths to Schemas.
- Inserted data into multiple tables using COPY.
- Created, inserted and extracted data from multi-dimensional arrays and composite types.
- Created temporary tables in PL/pgSQL programs to share data with Java programs.
- Created procedures/functions to execute arbitrary SQL statements (dynamic SQL), with or without input parameters and result sets.
- Used the WITH clause (Common Table Expressions) for auxiliary and recursive statements in larger queries.
- Implemented transaction control in PL/pgSQL procedures using COMMIT/ROLLBACK.
- Worked with aggregate functions and window functions such as RANK(), SUM(), and AVG(), and pivoted data.
- Handled exception conditions and warnings in PL/pgSQL objects, extracting error information via SQLSTATE and SQLERRM.
- Used EXPLAIN and EXPLAIN ANALYZE to identify and fine tune SQL queries for performance improvements.
- Experience working with high availability, disaster recovery and server tuning strategies including parameters, resources, contention, etc.
- Performed database maintenance activities such as VACUUM, REINDEX, and ANALYZE on database objects.
- Recreated Oracle packages as PostgreSQL schemas and package objects as schema objects.
- Worked with the AWS Schema Conversion Tool (SCT) to convert Oracle schemas to PostgreSQL schemas.
- Worked on data migration (full load and change data capture (CDC)) to Amazon Aurora/RDS with PostgreSQL compatibility using AWS Database Migration Service (DMS).
- Experience in working with Bitbucket in managing repositories using Git revision control systems. Used JIRA to track bug issues and change management.
- Experience in resolving production issues in minimal time and provided optimal solutions.
- Created Functional Design Documents, Technical Design Documents and Unit Testing Documents.
- Unit tested the mappings by running SQL queries and comparing the data in source and target databases.
Environment: Oracle 12c, Windows 7, TOAD, SQL*Plus, Postgres 11.5, DBeaver 5.3.2, WinSCP 5.13.5, SuperPuTTY, SourceTree, Bitbucket, Confluence, AWS Schema Conversion Tool
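The refcursor and SQLSTATE-handling patterns described in this role can be sketched in a minimal PL/pgSQL example; the get_orders function and the orders table are hypothetical names for illustration, not from any actual project:

```sql
-- Illustrative sketch: a PL/pgSQL function that opens a named refcursor
-- for a Java caller and reports errors via SQLSTATE/SQLERRM.
-- The orders table and get_orders name are hypothetical.
CREATE OR REPLACE FUNCTION get_orders(p_status text)
RETURNS refcursor
LANGUAGE plpgsql
AS $$
DECLARE
    c_result refcursor := 'order_cursor';   -- name the caller will FETCH from
BEGIN
    OPEN c_result FOR
        SELECT order_id, customer_id, order_total
        FROM   orders
        WHERE  status = p_status;
    RETURN c_result;
EXCEPTION
    WHEN OTHERS THEN
        -- surface the error details, then re-raise to the caller
        RAISE NOTICE 'get_orders failed: % (SQLSTATE %)', SQLERRM, SQLSTATE;
        RAISE;
END;
$$;
```

Inside a transaction, a caller would run `SELECT get_orders('OPEN');` and then `FETCH ALL FROM order_cursor;` to consume the result set.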
Senior Oracle/PostgreSQL Developer
- Coordinated with management on deliverables and impediments, proactively initiating dialog as needed.
- Worked with support, third-party, and infrastructure teams to address issues and defects and design best-in-class solutions for system enhancements.
- Ensured consistency in the quality of updates (beyond technical work) and messages going out from the WMS IT team.
- Created high-level technical design and data model as per the business requirements.
- Developed triggers and master tables for automatic creation of primary keys.
- Built system interfaces that send data from the WMS operational database to the data warehouse and data mart, using packages for legacy feeds and GoldenGate for data replication to the reporting database.
- Created DDL scripts and PL/SQL packages for enhancement and fixes.
- Followed best practices working in Agile Scrum Methodology.
- Created indexes on the tables for faster retrieval of the data to enhance database performance.
- Performed SQL, PL/SQL, and application tuning using tools such as EXPLAIN PLAN, SQL Trace, AUTOTRACE, AWR reports, and SQL Profiler.
- Extensively used hints to direct the optimizer toward an optimal query execution plan.
- Scheduled jobs using DBMS_SCHEDULER to create, run, and manage jobs.
- Used bulk collections for better performance and easier retrieval of data, reducing context switching between the SQL and PL/SQL engines.
- Partitioned the fact tables to enhance the performance.
- Extensively used bulk collection in PL/SQL objects to improve performance.
- Experience connecting to a PostgreSQL database from a Java program to perform CRUD operations.
- Set up and built AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, Security Groups, Auto Scaling, and RDS) using CloudFormation JSON templates.
- Created Python scripts to migrate data from Oracle databases to PostgreSQL databases.
- Created PL/pgSQL functions to return data through refcursors in the PostgreSQL database.
- Performed function overloading in the PostgreSQL database based on differing numbers of input parameters.
- Prepared reports using MS Excel (line/pie/bar charts, pivot tables, VLOOKUP, Solver).
- Created XML messages and DTDs for BizTalk to transfer data between interfaces.
- Used PRAGMA AUTONOMOUS_TRANSACTION to avoid mutating-table errors in database triggers.
- Handled errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application.
- Extensively worked on SQL joins (inner join, outer join, Self-join), views, packages, procedures, functions.
- Implemented T-SQL concepts such as table row constructors, table functions, the OUTPUT clause, TOP, transactions, exception handling, and error functions.
- Created a database archival process to purge data per the retention period agreed with users.
- Extensively coordinated with the DBA team to execute fixes on the production database, resolve disk-usage issues, plan database design, and start/stop the database.
- Maintained development, test, and production mapping migrations using TFS; also used TFS to maintain metadata, security, and reporting.
- Ensured change tickets were in place well in advance of deployment and well documented.
- Ensured code versioning and backups were maintained and coordinated with production environments.
- Ensured SLAs were met per timelines agreed with users.
Environment: Oracle 12c, Postgres 9.x/10.10, AWS, Windows 7, pgAdmin, Oracle SQL Developer 3.2, SharePoint, Team Foundation Server (TFS), Message Queue (MQ), SQL*Plus, SQL Server 2012/2014, SSMS
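The BULK COLLECT work mentioned in this role follows a standard PL/SQL batching pattern; a minimal sketch is below, with staging_orders and orders_hist as hypothetical table names:

```sql
-- Illustrative sketch of BULK COLLECT with a FETCH LIMIT plus FORALL:
-- rows move through a collection in batches, cutting context switches
-- between the SQL and PL/SQL engines. Table names are hypothetical.
DECLARE
    CURSOR c_src IS SELECT * FROM staging_orders;
    TYPE t_rows IS TABLE OF staging_orders%ROWTYPE;
    l_rows t_rows;
BEGIN
    OPEN c_src;
    LOOP
        -- fetch up to 500 rows at a time into the collection
        FETCH c_src BULK COLLECT INTO l_rows LIMIT 500;
        EXIT WHEN l_rows.COUNT = 0;
        -- single bulk-bind insert for the whole batch
        FORALL i IN 1 .. l_rows.COUNT
            INSERT INTO orders_hist VALUES l_rows(i);
        COMMIT;
    END LOOP;
    CLOSE c_src;
END;
/
```

The LIMIT clause bounds PGA memory use per batch, which is why it is usually preferred over an unbounded BULK COLLECT on large tables.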
Senior Database Developer
- Drilled deep into the existing data model and studied the business requirements, data relationships, and data flows to gain insight into the business model and suggest better ways to implement new functionality.
- Worked with architects and business analysts to reflect business requirements in the data model, with project development teams to convert the model into technical specifications, and with quality assurance analysts to share knowledge of the model.
- Automated executing script files through UNIX by scheduling the jobs using Autosys.
- Created various SQL Scripts involving Insert, Sequences, Constraint and Index scripts.
- Created and Modified SQL Triggers, Procedures, Functions, packages and bulk collections.
- Developed ER Diagrams, Data flow diagrams based on the requirement.
- Involved in logical and physical database design, Identified Fact Tables, Transaction Tables.
- Proactively tuned SQL queries and performed refinement of the database design leading to significant improvement of system response time and efficiency.
- Involved in SQL tuning, PL/SQL tuning, and application tuning using tools such as TKPROF and EXPLAIN PLAN.
- Extensively used materialized views, pragma autonomous transactions, object types, table functions, and pipelined functions to implement business logic effectively.
- Extensively used Global Temporary Tables to load the required data set to increase the performance.
- Worked on PL/SQL Implicit, Explicit, and REF Cursors and created Triggers for tables using SQL Developer.
- Analyzed the business logic with DSO (Data Strategy & Operations) team, identified bugs in the existing code, and efficiently fixed them.
- Handled errors using system defined exceptions and user defined exceptions.
- Experience in dynamic SQL using DBMS_SQL, records, arrays, and exception handling.
- Worked with collections to reduce the overhead of cursor looping; bulk-fetched data from the SQL engine into collections and handled exceptions using SAVE EXCEPTIONS and SQL%BULK_EXCEPTIONS.
- Involved in data loading using PL/SQL and SQL*Loader, calling UNIX scripts to download and manipulate files.
- Experience working with the SQLAlchemy ORM for database connectivity and data persistence.
- Worked with Application and ETL teams for high volume data loads and data migrations.
- Performed performance tuning while migrating the application from Oracle 10g to 11g.
- Provided 24x7 on-call support for the production database.
- Worked with source teams to resolve data quality issues raised by end users.
- Coordinated with Denodo team to integrate data from Multiple Data sources.
- Gained experience building and publishing virtual data services in Denodo.
- Created reports using the visualization tool Tableau for analyzing data.
- Exported production data to the test database to run deployment scripts before moving them to production, typically while cleaning up polluted production data and updating/deleting large amounts of data. Performed the appropriate tests, generated comparison reports, and submitted them to the clients.
Environment: Oracle 11g, UNIX, Rational Rose, SQL Developer, MS Office, MS Excel, ClearCase, HTML, Denodo 5.0, Eclipse, Data Pump Export and Import Utility, Python 2.7, PyCharm, SQLAlchemy
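The pipelined table functions cited in this role expose PL/SQL logic as a queryable rowset; a minimal sketch follows, with the type and function names invented for illustration:

```sql
-- Illustrative sketch of a pipelined table function: rows are PIPEd to
-- the caller as they are produced instead of being built up in memory.
-- t_num_tab and series are hypothetical names.
CREATE OR REPLACE TYPE t_num_tab AS TABLE OF NUMBER;
/
CREATE OR REPLACE FUNCTION series(p_n IN NUMBER)
    RETURN t_num_tab PIPELINED
IS
BEGIN
    FOR i IN 1 .. p_n LOOP
        PIPE ROW (i);          -- emit one row at a time
    END LOOP;
    RETURN;                    -- pipelined functions end with a bare RETURN
END;
/
-- Usage: the function can be queried like a table.
-- SELECT COLUMN_VALUE FROM TABLE(series(5));
```

Because rows stream to the consumer incrementally, this pattern works well for large result sets fed into SQL joins or ETL steps.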
Senior Database Developer
- Gathered requirements from business owners and functional users.
- Designed ER diagrams using ERwin to define the logical and physical relationships of the database.
- Participated in improving the performance of the Queries using TKPROF and EXPLAIN PLAN.
- Used set operators in PL/SQL such as UNION, UNION ALL, INTERSECT, and MINUS.
- Developed various mappings documents with transformation logic to load data from source to target.
- Developed SQL*Loader scripts to load data into custom tables. Made extensive use of PL/SQL collections and records, and of partitioning methods for Oracle tables and indexes.
- Extensively used REF CURSORs to pass required data sets to the front-end team working in Java.
- Utilized Java/J2EE, JDBC and JSP for Database connection to user interface.
- Created schemas, stored procedures, Triggers and functions for Data Migration.
- Performed requirement validation checks on offshore work products and ensured all requirements were tracked.
- Extensively used DTS packages to migrate data from the previous timesheet database into the SQL Server database.
- Improved reporting logic built on Microsoft T-SQL and PL/SQL by using analytic functions instead of traditional aggregate functions.
- Dismantled 11 slow-running legacy Cognos reports built on multi-layered, complex BI objects and redesigned them with Microsoft T-SQL procedures without any data loss, achieving the same precision with lower response time.
- Modified the existing web application GUI for look and feel using HTML and JavaScript.
- Worked with XML file format and created the workflow in Microsoft Visio.
- Developed the Back Office rules and related teams using CranSoft.
- Fixed data issues and wrote various SQL scripts and procedures for the migration process.
- Supported production incident troubleshooting; analyzed current and projected future database performance requirements and defined system needs for development databases.
- Prepared process audit documents wherever required.
Environment: Oracle 10g, Windows XP, Rational Rose, Back Office (CranSoft), Optima (project management tool), Microsoft Visio, SQL Developer, MS Office, MS Excel, SQL Server 2005, XML, JDBC, JSP
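The analytic-over-aggregate rewrite described in this role can be sketched with a window-function query; emp and its columns are hypothetical, standing in for the report's base table:

```sql
-- Illustrative sketch: analytic functions compute per-row results over a
-- partition in one pass, replacing a GROUP BY subquery joined back to
-- the detail rows. The emp table and columns are hypothetical.
SELECT employee_id,
       department_id,
       salary,
       -- each row's share of its department's total salary
       salary / SUM(salary) OVER (PARTITION BY department_id) AS dept_share,
       -- salary rank within the department
       RANK() OVER (PARTITION BY department_id ORDER BY salary DESC) AS dept_rank
FROM   emp;
```

The single-pass form avoids the self-join that the aggregate version needs, which is the usual source of the response-time gains mentioned above.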