
Database Architect Resume


SUMMARY

  • Oracle PL/SQL Developer/Architect with 11+ years of application development experience covering database design and development; data warehouse design, modelling and development; and ETL development. Involved in requirement analysis, database design, data modelling, process flow analysis, development, benchmark analysis, performance tuning, capacity planning, production implementation and production support of business applications, participating in all phases of the SDLC (Software Development Life Cycle) with timely delivery against aggressive deadlines.
  • Over 11 years of work experience in Oracle application development; currently associated with Harman Connected Services ( Confidential ) for more than 7 years.
  • Good experience in SQL and PL/SQL Performance Tuning.
  • Good experience with the Snowflake Data Warehouse and cloud analytics.
  • Experience building and deploying ETL and data pipelines using Spark and Python.
  • Working experience with SQL and MPP systems (SQL, Spark SQL, SnowSQL, Hive, etc.).
  • Designed and developed many highly scalable and extensible staging and data warehouse applications and self-service platforms that enable collection, storage, modelling and analysis of massive data sets from different structured, semi-structured and unstructured sources.
  • Designed and developed data models (Entity Relationship (ER) models, ER-to-relational models, object-oriented data models, and network and hierarchical data models) and data warehouse methodologies (Inmon and Kimball modelling techniques).
  • Well versed in agile practices for software project development.
  • Working experience with RDBMS technologies (Oracle PL/SQL, Oracle Database 12c, Toad, Toad Data Modeller, SQL Developer).
  • Good Knowledge on Big Data Concepts and Technologies (Apache Hadoop, HDFS, Apache Hive, Hue, Sqoop).
  • Working Experience with Cloud Technologies (Amazon AWS S3, Microsoft Azure) for File Transfer.
  • Good Knowledge on AWS Web App 3-tier architecture and AWS services - EC2, ELB, ASG, RDS, ElastiCache, S3, CLI, Elastic Beanstalk, CICD, CloudFormation, CloudWatch, X-Ray, CloudTrail, SQS, SNS, Kinesis, AWS Lambda, DynamoDB etc.
  • Hands-on experience with the scheduling tools Control-M and Autosys.
  • Hands-on experience with Object Oriented Programming Languages (Python, PRO*C/C++).
  • Excellent analytical, mathematical, communication and interpersonal skills, with a strong ability to lead technical teams.
  • Hands-on experience with the check-in/check-out tool TFS and the version control tool Git.
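The ETL pipeline work summarized above can be illustrated with a minimal, self-contained sketch. The field names and transformation here are hypothetical, and a production pipeline would express the same extract/transform/load shape in Spark against real sources:

```python
# Minimal ETL sketch: extract rows, transform them, load into a target store.
# All table/column names are illustrative, not from any real warehouse.

def extract(rows):
    """Simulate extraction from a source system (e.g. a staging table)."""
    return [dict(r) for r in rows]

def transform(records):
    """Standardise name fields and derive a full_name column."""
    out = []
    for r in records:
        first = r["first_name"].strip().title()
        last = r["last_name"].strip().title()
        out.append({"first_name": first, "last_name": last,
                    "full_name": f"{first} {last}"})
    return out

def load(records, target):
    """Append transformed records to the target store; return row count."""
    target.extend(records)
    return len(records)

warehouse = []
source = [{"first_name": " ada ", "last_name": "LOVELACE"}]
loaded = load(transform(extract(source)), warehouse)
```

In Spark the same transform would typically become a DataFrame operation, with the load step writing to the warehouse tables.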

TECHNICAL SKILLS

Programming Languages: Oracle PL/SQL 12c, UNIX Shell Scripting, Hive Query Language, PRO*C/C++, Python

Databases: Oracle PL/SQL, Oracle Database 11g/12c

Data Warehousing Systems: Snowflake, AWS Redshift

Big Data Technologies: Apache Hadoop, Apache Hive, HDFS, MapReduce

BI Reporting Tools: RPM

Operating Systems: LINUX, WINDOWS

Technology Tools: TOAD, Toad Data Modeller, PUTTY, ALTERYX, AUTOSYS, CONTROL-M, SQL DEVELOPER, INFORMATICA, WINSCP, HUE

File Transfer Technologies: Amazon Web Services (AWS) S3, MICROSOFT AZURE

IT Processes: Software Development Life Cycle (SDLC), Agile Process

Hardware: Cluster Servers

PROFESSIONAL EXPERIENCE

Confidential

Database Architect

Responsibilities:

  • Leading Database Technical Team (Data Warehousing Development, DWH).
  • Work with system operations personnel and other application teams in production support and defining system recovery procedures. Identify root causes and create solutions for application failures.
  • Pivotal in building the offshore team (Bangalore, India) with knowledge transition. Act as subject matter expert (SME) for several key existing warehouse applications, providing assistance to support teams as needed.
  • Collaborate with business analysts, design analysts and other technical/functional teams to understand and gather requirements related to DWH project.
  • Evaluate the project Requirements to determine the Feasibility, cost and time required for development of the solution.
  • Design and develop staging and data warehousing applications of the team to support the requirements.
  • Plan, develop, design, test, implement, and support custom proprietary warehouse applications in various software languages, platforms and environments.
  • Providing timely status reports to management on team project accomplishments, issues, etc.
  • Design highly scalable and extensible staging and data warehouse applications that enable collection, storage, modelling and analysis of massive data sets from different structured, semi-structured and unstructured sources. Help the team understand the design and proceed with development.
  • Design and develop the conceptual/Logical/Physical Data Models using Toad Data Modeller for data warehouses, data marts and Intelligence Imputation Modules.
  • Develop and maintain Complex Code Units, perform quality Check and debugging for warehouse applications as per requirement:
  • Develop PL/SQL packages, procedures, functions, triggers, views, exception handling and other code units for retrieving, manipulating, checking and migrating complex data sets in Oracle.
  • Build self-service applications and platforms using Oracle, Hadoop, Hive, Python, AWS and Azure to ingest the data, perform data profiling and perform transformations on data for enterprise data warehouse. End to end Testing and Deployment of applications and solutions for reducing the operational cost.
  • Create Oracle schemas and databases for client warehouses, perform capacity planning (disk space, RAM, CPU, etc.) and collaborate with the DBA to assign the required amount of space to each schema.
  • Develop Oracle data structures using built-in and user-defined objects such as associative arrays, VARRAYs and nested tables to support warehouse applications.
  • Develop custom-defined functions utilizing numerical and mathematical approximation algorithms.
  • Enforce business rules in the data warehouse using different Integrity Constraints such as Primary Key constraints, Foreign Key Constraints, Unique Key Constraints etc.
  • Enhance performance, manageability and availability of different applications and reduce cost for storing large amounts of data using table partitioning and index partitioning.
  • Experience in interpreting AWR Reports.
  • Create different types of SQL statements, such as Data Definition Language (DDL), Data Manipulation Language (DML), Transaction Control Language (TCL) and session control statements, for advanced and complex queries to support warehouse applications.
  • SQL and PL/SQL Performance Tuning.
  • Oracle application tracing using DBMS_OUTPUT, DBMS_UTILITY, DBMS_APPLICATION_INFO and EXPLAIN PLAN.
  • Identifying bottlenecks in PL/SQL code using DBMS_PROFILER and DBMS_HPROF.
  • Build various Healthcare Intelligence Machine Learning Modules like Imputation and Patient Transactional Dataset using emerging technologies like Oracle PL/SQL, Hadoop, Python, Hive, Toad, Putty, SQOOP, Unix Shell Scripting, AWS S3 and Microsoft Azure. Impute different Healthcare data and perform predictive analysis. Develop data imputation machine learning (ML) algorithms to implement the custom Imputation Process using Time Series Datasets, Test Datasets, Train Data sets, Statistical Analysis, Principal component analysis (PCA), Vectors, Matrices, Linear Algebra, Probability Theory and Regression Models.
  • Automate any repeated manual process to Oracle/Hadoop HiveQL/AWS S3/Azure workflows using Autosys scheduler and UNIX Shell Scripting.
  • Reduce the operational cost of existing application by bringing in innovations and processes to automate.
  • Focused on identifying the pain areas, mundane task and propose solutions to automate products concentrating on reducing cost, manual effort and human prone error.
  • Create automation models, which not only save the time spent doing tasks manually but also reduce operational cost.
  • Perform various Data Migration, Data Modelling and Performance Tuning Activities:
  • Perform data migration activities using SQL*Loader and the Import and Export utilities, from RDBMS to Hadoop using Sqoop, and between different RDBMS databases.
  • Design data warehouse models (Entity Relationship (ER) Model, ER to Relational Model, etc.) and data warehouse methodologies (Top-Down, Bottom-Up). Develop multidimensional schemas such as Star, Snowflake or Galaxy schemas as per data warehousing requirements.
  • Perform SQL and PL/SQL performance optimization by analysing execution plans, SQL Profiler, DB Tuning Advisor, hints, query plans, plan cache, statistics and extended events, and by using DBMS_PROFILER, DBMS_HPROF, DBMS_UTILITY, deterministic function caching, package-based caching and function result caching.
  • Develop QC Reports to identify data anomalies and trend differences for high revenue generating client warehouses. Create descriptive reports using Alteryx tool if required. Develop automation frameworks to expedite the QC reports, Trending reports and adhoc reporting requirements to validate the data.
  • Design and develop operational data storage (ODS) and multi-dimensional DataMart model storage solutions. Develop standalone storage applications in Python using Object Oriented Concepts (OOP) and data structures to support data edits and reconciliation. Managing storage application errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application. Develop Built-in and user defined PL/SQL data structures to support storage application maintenance and use of parameters to build the process flow.
  • Maintain Oracle Data Concurrency and Consistency using Multiversion, Transaction-level and Optimistic Concurrency control methods.
  • Troubleshoot and fix the production automated job failures, data Investigation, provide a solution, fix the issue and deploy the updated code into Production. Ensure daily, weekly and monthly jobs run smoothly in Autosys. Work closely with high revenue generating users to understand their requirements, build their warehouses/data marts, and provide assistance.
  • Develop efficient Hive scripts with joins on datasets using various techniques. Load data from different datasets and decide which file format is most efficient for a task. Build distributed, reliable and scalable data pipelines to ingest and process data in real time. Review and manage log files, and investigate and resolve Autosys job failures. Use bulk collect and pipelined methodologies to transfer huge data sets in less time.
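As a simplified illustration of the imputation modules described above, the sketch below fills gaps in a numeric time series with the series mean. The data is invented, and the real modules used richer models (regression, PCA); this shows only the basic gap-filling step:

```python
# Hypothetical mean-imputation sketch for a time series with missing
# values (represented as None). Real imputation used statistical models.

def impute_mean(series):
    """Replace None gaps in a numeric series with the mean of observed values."""
    observed = [x for x in series if x is not None]
    mean = sum(observed) / len(observed)
    return [mean if x is None else x for x in series]

# Fill the single gap in a tiny synthetic series.
filled = impute_mean([10.0, None, 14.0])
```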
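The bulk-collect-and-pipeline pattern mentioned above can be sketched language-neutrally in Python: fetch rows in fixed-size batches rather than one at a time, and stream each batch onward as soon as it is full, much like PL/SQL's `BULK COLLECT ... LIMIT` feeding a pipelined function. The row source here is synthetic:

```python
# Batched streaming sketch mimicking BULK COLLECT ... LIMIT: rows are
# accumulated into fixed-size batches and yielded as soon as each fills.

def fetch_in_batches(rows, batch_size=1000):
    """Yield successive batches of at most batch_size rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:          # flush the final partial batch
        yield batch

# Stream 2500 synthetic rows in batches of 1000.
batches = list(fetch_in_batches(range(2500), batch_size=1000))
```

The benefit over row-by-row processing is fewer round trips between the SQL and procedural engines, which is what makes the PL/SQL version transfer large data sets in less time.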
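The QC and trending reports described above boil down to comparing a metric's current value against its prior value and flagging outsized swings. A minimal sketch, with made-up metric names and an assumed 10% tolerance:

```python
# Illustrative trend-difference QC check: flag metrics whose relative
# change between two report periods exceeds a tolerance. Metric names
# and the 10% threshold are hypothetical.

def trend_differences(prev, curr, tolerance=0.10):
    """Return {metric: relative_change} for metrics exceeding tolerance."""
    anomalies = {}
    for metric, old in prev.items():
        new = curr.get(metric)
        if new is None or old == 0:
            continue  # cannot compute a relative change
        change = abs(new - old) / abs(old)
        if change > tolerance:
            anomalies[metric] = round(change, 3)
    return anomalies

# "claims" jumped 30% month over month; "members" moved only 1%.
flags = trend_differences({"claims": 1000, "members": 500},
                          {"claims": 1300, "members": 505})
```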

Confidential

SENIOR SYSTEMS ANALYST

Responsibilities:

  • Understand and collect requirements from business users or business analysts and convert those requirements into SQL queries and PL/SQL code.
  • Design Documentation for the development work.
  • Design and develop various Intelligent Modules, Data Warehouses and Storage solutions using Oracle Database, Hadoop and Python.
  • Expertise in SQL and PL/SQL programming, developing complex code units, PL/SQL Packages, Procedures, Functions, Triggers, Views and Exception handling for retrieving, manipulating, checking and migrating complex data sets in Oracle.
  • Involved in all phases of the SDLC (Software Development Life Cycle) from analysis, design, development, testing, implementation and maintenance with timely delivery against aggressive deadlines.
  • Involved in Data flow diagrams, Data dictionary, Database normalization theory techniques, Entity relation modelling and design techniques.
  • Effectively made use of Table Functions, Indexes, Table Partitioning, Collections, Analytical functions, Materialized Views, Query Re-Write and Transportable table spaces.
  • Partitioned large Tables using range partition technique. Worked extensively on Ref Cursor, External Tables and Collections.
  • Work on SQL performance tuning using Cost-Based Optimization (CBO). Good knowledge of key Oracle performance features such as the Query Optimizer, execution plans and indexes. Performance tuning for Oracle RDBMS using EXPLAIN PLAN and hints.
  • Troubleshoot and fix production automated job failures: investigate the data, provide a solution, fix the issue and deploy the updated code into production. Ensure daily, weekly and monthly jobs run smoothly in Autosys.

Confidential

IT ENGINEER

Responsibilities:

  • Collaborate with business analysts and team leader to understand and gather requirements.
  • Production Support to Troubleshoot Provisioning issues and resolve open tickets.
  • Design Documentation for the development work.
  • Develop Backend Oracle PL/SQL Stored Procedures and Packages for data warehouses and data marts. Develop various data structures to support the application.
  • Design and Develop Various Banking Intelligence Modules.
  • Perform Unit Testing during the Development Phase.
  • Develop UNIX scripts to process files from various Interfaces and call various Oracle Procedures. Call PRO*C/C++ programs inside UNIX script for embedding complex database objects and implementing the complex backend logics.
  • Create various Database Objects and Crontab entries for Oracle Backend processes.
  • Design and develop both ad hoc and static reporting solutions. Design and administer data quality mechanisms.
  • Automate Oracle and UNIX jobs using Scheduler tool Autosys.
  • Resolve Control M Job Failures and move the updated code into production.

Confidential

ASSISTANT SYSTEMS ENGINEER

Responsibilities:

  • Collaborate with business analysts and team leader to understand and gather requirements.
  • Develop comprehensive data integration solutions. Develop and maintain semantic layer(s) for multiple types of access.
  • Deliver robust test case, plans and strategies. Handle all phases of software development life cycle; facilitation, collaboration, knowledge transfer and process improvement.
  • Troubleshoot issues related to existing data warehouses to meet client deliverable SLAs.
  • Work on SQL and PL/SQL performance tuning using Cost-Based Optimization (CBO). Good knowledge of key Oracle performance features such as the Query Optimizer, execution plans and indexes. Performance tuning for Oracle RDBMS using EXPLAIN PLAN and hints.
  • Automate Oracle and UNIX jobs using Scheduler tool Autosys.
  • Resolve Autosys Job Failures and move the changed code into Production.
