Database Architect Resume Profile
Pleasanton, CA
SUMMARY
- Over 15 years of experience in applications analysis, design, development, testing, and implementation; systems integration; data migrations and conversions; decision support/reporting systems; performance tuning; database release management; data interface/refresh processes for SOLR; and some Oracle database administration, using the Oracle RDBMS on UNIX and Linux platforms.
- I have worked in companies in the following industries: banking and finance, mobile software, telecommunications, comprehensive wealth management, databases, software development, internet web site personalization, and risk management/credit analysis.
PROGRAMMING LANGUAGES
PL/SQL, SQL, Korn Shell, sed and awk, Perl, HTML, C, JavaScript, XML
RDBMS
Oracle 7.x, Oracle 8i, Oracle 9i, Oracle 10g, Oracle 11g
SOFTWARE TOOLS/UTILITIES
- SQL*Loader, SQL*Plus, Oracle Import/Export, Oracle Enterprise Manager
- Toad, SQL*Net, C-ISAM, ERwin data modeling, SQL Navigator, ODBC, Benthic software tools
SOFTWARE PACKAGES
- Oracle Financials 10.6, Oracle Order Entry and Shipping Modules of Oracle Manufacturing
- Minx Manufacturing Application, ASK MANMAN Manufacturing Application
- Telecommunications Record Information Billing System (TRIS)
- DataFaction High Net Worth Client Accounting System
OPERATING SYSTEMS / HARDWARE PLATFORMS
- HP-UX - HP V Class Machines.
- Sun Solaris - Sun Enterprise Series.
- UNIX 4.2 BSD and System V - MC68030-based minicomputer
- Windows NT/2000 Server Enterprise Edition - Dell dual-Pentium CPU system
- Red Hat Linux
WORK SUMMARY
Database Architect
Confidential
- Performance tuning of Oracle Claims, Quote, Legacy, Billing, and Fraud data warehouse ETL processes (data warehouse size: 15-20 TB). Improved performance by 10-20x and made the data warehouse systems scalable with data growth.
- Designing, developing, testing, implementing, and load/performance testing new ETL processes for data warehouses.
- Working with the Big Data team to design, develop, and test very large data extracts and interfaces from OLTP source databases for Hadoop-based Big Data ETL processing.
- Designing, developing, unit/integration testing, and load/performance testing data refresh processes from Oracle data warehouses and OLTP databases required for SOLR searches. The key to this design is de-normalization, which makes the SOLR merge and re-index of its file structures very fast. Because the refreshes were incremental, SOLR data merge and re-index became 20-30x faster (a minimal sketch of this refresh pattern follows this list). OLTP database size was 7 TB.
- Designing, developing, and implementing data archival strategies for the Oracle data warehouses and OLTP databases, making both the ETL processes and the OLTP batch processes faster, more efficient, and scalable.
- Reviewing SQL and PL/SQL code written by data warehouse and OLTP developers and documenting issues with it: performance problems; missing exception handling, debug, and error logging; deadlock and locking issues; context-switching issues that can cause large bottlenecks; code commenting, indentation, and readability; correct query hints for parallel processing; bypassing redo logs; missing indexes; missing query filters that would make a query far more efficient, or rewriting the SQL query more efficiently; creating data warehouse tables/indexes in larger block-size (32K or 64K) tablespaces for faster read and write performance; pinning tables in the Oracle SGA to improve dashboard query performance; and so on (a minimal load/archival sketch illustrating some of these patterns follows this list).
- Working with developers to debug and fix performance and database issues in production, QA and UAT databases.
- Working with Oracle DBAs to change Oracle server parameters for performance improvements.
- Working with Oracle DBAs to monitor database performance and to provide both quick and long-term fixes for issues caused by ETL processes.
- Working with the release management team and developers to plan, document, implement, and validate production database releases for data warehouses and OLTP databases on Linux, Sun Solaris, and HP-UX.
- Documenting release notes and applying and validating database code releases on all development, QA, and UAT databases.
- Working with load and performance testing teams to ensure that new data warehouse ETL code changes did not impact SLAs, and that new OLTP database code did not overload the OLTP database or degrade new and existing query performance.
- Performing POCs for data warehouse and OLTP teams to provide high-performance, scalable database solutions.
- Working with developers and DBAs from roughly 10 other database application teams (HR, Oracle Financials, Payments, Billing, etc.) to tune SQL and batch processes, understand their systems in detail, and provide production database fixes.
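The SOLR refresh work above relied on an incremental, de-normalized feed. Below is a minimal sketch of that pattern, assuming hypothetical table and column names (device, account, plan, solr_refresh_stage, etl_watermark); it is illustrative only, not the actual production code.

    -- Minimal sketch: incremental, de-normalized refresh feed for SOLR.
    -- All table and column names here are hypothetical placeholders.
    DECLARE
      v_last_run  TIMESTAMP;
    BEGIN
      -- Read (and lock) the high-water mark left by the previous run.
      SELECT last_refresh_ts
        INTO v_last_run
        FROM etl_watermark
       WHERE feed_name = 'SOLR_DEVICE_FEED'
         FOR UPDATE;

      -- Stage only rows changed since the last run, already flattened
      -- so SOLR can merge and re-index without additional joins.
      INSERT /*+ APPEND */ INTO solr_refresh_stage
             (device_id, account_name, plan_name, device_status, changed_ts)
      SELECT d.device_id, a.account_name, p.plan_name, d.status, d.updated_ts
        FROM device  d
        JOIN account a ON a.account_id = d.account_id
        JOIN plan    p ON p.plan_id    = d.plan_id
       WHERE d.updated_ts > v_last_run;

      -- Advance the watermark only after the stage load succeeds.
      UPDATE etl_watermark
         SET last_refresh_ts = SYSTIMESTAMP
       WHERE feed_name = 'SOLR_DEVICE_FEED';

      COMMIT;
    END;
    /

Because only the delta since the watermark is staged, SOLR re-indexes a small, flat slice of data instead of the full 7 TB source.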
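One load pattern flagged in those code reviews (direct-path, parallel inserts that bypass most redo, plus partition-based archival) looks roughly like the sketch below; claim_fact, claim_stage, the partition name, and the degree of parallelism of 8 are hypothetical.

    -- Minimal sketch: direct-path, parallel load that bypasses most redo
    -- (when the target table is NOLOGGING), plus partition-based archival.
    -- Table names, partition name, and DOP are illustrative only.
    ALTER SESSION ENABLE PARALLEL DML;

    INSERT /*+ APPEND PARALLEL(f, 8) */ INTO claim_fact f
    SELECT /*+ PARALLEL(s, 8) */ s.*
      FROM claim_stage s;

    COMMIT;

    -- Archive old data by dropping an aged range partition instead of
    -- running a very large DELETE.
    ALTER TABLE claim_fact DROP PARTITION claims_2010_q1 UPDATE GLOBAL INDEXES;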
Confidential
Data Warehouse Architect
- Completely re-architected, re-designed, and re-wrote the data warehouse and ETL processes, making them more scalable and reliable and improving process performance by 15-20x. Took complete ownership of the entire project from design to deployment. This project was developed for Mobile Update customers with very large data sets, such as Verizon, Tata, US Cellular, Intel, Qualcomm, Fujitsu, and Orange, and was implemented as a SaaS hosted service model for some customers.
- Architected, designed, developed, and implemented a data warehouse and ETL solution for Active Care customers, with data providing intelligence for mobile carriers, OEMs, and internal operations. This was implemented as a SaaS hosted service model. Took complete ownership of the project from design to deployment.
- Architected, designed, developed, and fully automated data extracts for very large customers such as Verizon and Tata to provide lists of mobile phones with outdated firmware. These extracts helped the customers monitor and improve the FOTA (Firmware Over The Air) update success rate, thereby reducing customer service calls for mobile device issues related to outdated firmware. Took complete ownership of the project from design to deployment.
- Architected, designed, developed, tested, and implemented the end-to-end mobile analytics data warehouse/ETL solution required by OEMs and network service providers to provide detailed analysis for the following business cases: cell phone battery heats up quickly, cell phone battery drains quickly, dropped calls, phone reboots, and application crashes on the phone.
- These business cases let OEMs and network service providers use the detailed analysis provided by the Mobile Analytics UI and reports to have informed conversations, with network providers and with customers respectively, about why a cell phone battery heats up or drains quickly and about the possible reasons for dropped calls, phone reboots, and application crashes.
- This was built to reduce the cell phone returns that cost OEMs and network service providers billions of dollars in revenue, and to reduce damage to cell phone brand names and the resulting loss in sales.
- Data warehouse size was about 7 TB, with a daily data volume of 5-10 million.
- This project was implemented as a SaaS hosted service model.
- Re-architected, re-designed, developed, and fully automated the Java-based batch processes used for Mass Campaigns, Device Import, and Subscription Import, making them scalable and reliable and improving their performance by 20-30x. Took complete ownership of the project from design to deployment.
- Provided database performance solutions in production emergencies for the following very-large-volume database processes (a minimal sketch of the batched clean-up style appears after this list):
- Delete orphaned devices and subscriptions.
- Data fix for devices.
- Provided data fixes for issues created in production by inefficient SQL scripts that performed incomplete data updates or had to be killed due to performance problems, leaving data inconsistent.
- Mentored Java developers and database developers implementing database scripts.
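The emergency clean-up jobs above (for example, deleting orphaned devices) typically follow a batched, restartable delete style. Below is a minimal sketch of that approach; the device and subscription table and column names are hypothetical, not the actual schema.

    -- Minimal sketch: batched delete of orphaned devices so undo stays
    -- bounded and locks are released frequently. Names are hypothetical.
    DECLARE
      v_rows  PLS_INTEGER;
    BEGIN
      LOOP
        -- Delete devices that no longer have a matching subscription,
        -- a small batch at a time.
        DELETE FROM device d
         WHERE NOT EXISTS (SELECT 1
                             FROM subscription s
                            WHERE s.device_id = d.device_id)
           AND ROWNUM <= 10000;

        v_rows := SQL%ROWCOUNT;
        COMMIT;
        EXIT WHEN v_rows = 0;   -- stop once nothing is left to clean up
      END LOOP;
    END;
    /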
Technology Used:
PL/SQL, SQL, SQL*Loader, SQL*Plus, Korn shell, Oracle 10g and Oracle 11g on Linux and UNIX, Oracle Enterprise Manager, AWR, Toad, sed and awk, Oracle Import/Export utilities, SQL Trace; automated via cron jobs.