
MongoDB, Postgres, MySQL, Oracle, Vertica, SQL Server and Hadoop Administrator Resume


Glendale, CA

SUMMARY

  • A collaborative engineering and administration professional with substantial experience in designing, administering, and executing database solutions for complex business problems involving large-scale data warehousing, real-time analytics, and reporting. Known for using the right tools when and where they make sense and for creating intuitive architectures that help organizations effectively analyze and process terabytes of structured and unstructured data.
  • Expert DBA on MongoDB, Postgres, MySQL, SQL Server, Oracle, and Vertica
  • Expert Hadoop administrator for Hadoop-based data platforms deployed on Cloudera, Hortonworks, or MapR
  • Substantial experience as an Oracle consultant at PG&E, Netscape, Sun Microsystems, Compaq, and HP
  • Good experience with search-engine technology, handling terabytes of data at Yahoo and Shozilla Confidential
  • Expertise in database replication technology: Oracle GoldenGate, Oracle Streams, SharePlex, MongoDB, SQL Server, MySQL, and Postgres
  • Experience in infrastructure and operational support at Ameriquest Confidential, DreamWorks Animation, and Confidential
  • Extensive experience in data warehousing and ETL operations

TECHNICAL SKILLS

Storage: EMC, 3PAR, IBM XIV, NetApp, JPOD

Operating Systems: Solaris, HP-UX, DOS, Windows, AIX 4.3.3, Red Hat Linux, IOS

Languages: Unix shells, Perl and Python scripts, NT scripts, C, C++, PL/SQL, and COBOL

RDBMS/DBMS: Oracle, SQL Server, Sybase, Postgres

NoSQL/Tools/Utilities: HP Vertica, MongoDB, Cassandra, Elasticsearch, HBase, Netezza, Kafka, Spark, Scala, Hive, HDFS, YARN, MySQL, Sqoop, Fuse, SaltStack, Docker, Vagrant, Toad, Oracle Enterprise Manager, Erwin, MS Project, Informatica, Oracle GoldenGate & Streams, Datadog, Pig

PROFESSIONAL EXPERIENCE

Confidential, Glendale, CA

Mongo, Postgres, MySQL, Oracle, Vertica, SQL Server and Hadoop Administrator

Responsibilities:

  • Installed and configured a 3-node MongoDB cluster across 3 data centers with proper replication and tag-aware sharding
  • Upgraded MongoDB across multiple versions from 2.4 to 3.0 on new hardware, and implemented new MongoDB development and storage features
  • Heavily used GridFS to store images, and used embedded documents to maintain relationships
  • Worked closely with the development team on enhancements and new application deployments, tuning their covered queries against their data model
  • Troubleshot database performance by setting up profiling, analyzing queries, and implementing the right query plans
  • Guided the application team to use map-reduce functions, indexes, capped collections, and text search wherever possible
  • Decommissioned a cluster (Redwood City closure) and remapped/migrated its data across the remaining 2 data centers
  • Implemented a sound backup strategy to support the restore process, and refreshed non-production environments for application enhancements
  • Installed and configured object-relational Postgres instances with hot standby for production, using best-practice tuning parameters
  • Upgraded multiple OS and Postgres software versions; implemented Docker-level support to reduce the workload of maintaining multiple Docker image versions
  • Worked closely with the application team to migrate data from Oracle to Postgres
  • Ran multiple failover tests across different data centers
  • Maintained daily DBA activity on multiple Postgres instances
  • Installed and configured multiple MySQL instances for various projects, including the Hadoop repository; performed multi-master replication to replace a core Oracle internal application environment; evaluated the performance of multiple storage engines
  • Configured and maintained multi-master replication across 3 data centers
  • Tuned the databases at both the host level and the application level
  • Worked closely with the development team on data and code migration
  • Migrated some internal applications from Oracle to MySQL to reduce licensing costs, acting as a data-engineering bridge between the application and DBA teams
  • Installed, configured, and maintained Hortonworks (Hive, Spark, YARN, Ambari, ZooKeeper, Kafka, and a MySQL repository)
  • Deployed Big Data solutions in Vertica and Hadoop
  • Maintained a 15-node Hadoop production cluster (capacity, tuning, backup, upgrades, cleanup, etc.)
  • Maintained a 5-node non-production cluster (capacity, tuning, backup, upgrades, cleanup, etc.)
  • Proficient in database design principles, strategies, and methodologies, as well as data warehouse design and common methodologies
  • Practical experience with stored procedures, indexes, performance tuning, ETL, clustering, Vertica, data compression, and Big Data solutions; designed the Vertica cluster, data structures, and ETL processes, and performance-tuned Vertica queries
  • Created Vertica projections both with Database Designer (DBD) and by hand
  • Performed Vertica lifecycle upgrades through 6.x and 7.x and resolved support cases
  • Helped with Migration Efforts and Product lifecycle releases, operational support including 24x7 on-call support.
  • Worked with Product Management, Architects, Developers, Testing and Support teams.
  • Conducted various POCs and sandbox installations, including Vertica, for various groups
  • Installed the Vertica Analytics Engine and Management Console; created logical and physical schemas with tables and projections via Database Designer
  • Very good understanding of Vertica data encoding and compression, high availability (K-safety), the hybrid storage model (WOS, Tuple Mover, ROS), bulk loading, vsql and historical snapshot queries, query plans, and Flex Zone (SQL on JSON and Hadoop)
  • Designed, implemented, automated, and monitored multi-master replication of STUDIO data using Oracle Streams and GoldenGate
  • Installed, maintained, and upgraded Oracle versions, storage, OS, servers, and networks for multiple Oracle RAC production and non-production databases supporting the core business of DreamWorks' 3 studios (Glendale, Redwood City, and India)
  • Deployed newly implemented functionality from development to production
  • Served on the SWAT team on many occasions for replication, performance, and data-corruption issues
  • Retired Data Guard in Redwood City and enabled replication to the new Las Vegas data center (storage replication on HP 3PAR)
  • Worked with application stakeholders to choose the right tools for designing and transferring data from RDBMS to NoSQL environments (Vertica, HDFS, Elasticsearch)
  • Worked on DB build-as-a-service using SaltStack, Docker, and OpenShift
  • Completed the design of the Render and Media applications, which are up and running
  • Replaced Splunk with Elasticsearch to cut costs while retaining more useful data
  • Replaced Unix cron with Rundeck (Oracle/Postgres as the repository)
  • Allocated and maintained HP 3PAR storage for all DB environments
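
For illustration, the tag-aware sharding described in the MongoDB bullets above might look roughly like this in the mongo shell (replica-set, database, and host names are hypothetical placeholders):

```javascript
// Run against a mongos router. All names below are placeholders.
sh.addShard("rsGlendale/mongo-gdl-1:27017")        // one shard per data center
sh.enableSharding("media")                         // enable sharding on the db
sh.shardCollection("media.assets", { assetId: 1 }) // choose the shard key

// Tag-aware sharding: pin a key range to the Glendale shard
sh.addShardTag("rsGlendale", "GDL")
sh.addTagRange("media.assets",
               { assetId: MinKey }, { assetId: MaxKey }, "GDL")
```

`sh.status()` then shows the shards, tags, and tag ranges the balancer will honor.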
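
The hot-standby Postgres setup above could be sketched with 9.x-era configuration along these lines (the hostname and replication user are placeholders):

```conf
# primary: postgresql.conf
wal_level = hot_standby
max_wal_senders = 3
wal_keep_segments = 64

# standby: recovery.conf
standby_mode = 'on'
primary_conninfo = 'host=pg-primary port=5432 user=replicator'

# standby: postgresql.conf
hot_standby = on        # allow read-only queries on the standby
```

The standby is typically seeded from the primary with `pg_basebackup` before being started.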
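
A three-node MySQL multi-master setup like the one described above typically staggers auto-increment values so the nodes never generate colliding keys; a my.cnf sketch for node 1 of 3 (server IDs are illustrative):

```ini
[mysqld]
server-id                = 1          # unique per node (2 and 3 on the others)
log-bin                  = mysql-bin  # write binlogs for the other masters
log-slave-updates        = 1          # re-log replicated writes for the ring
auto_increment_increment = 3          # step by the number of masters
auto_increment_offset    = 1          # node 1: 1,4,7...  node 2: 2,5,8...
```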
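
A hand-built Vertica projection of the kind mentioned above might look like this (the table and column names are hypothetical; DBD generates similar DDL automatically):

```sql
CREATE PROJECTION events_p
(
    user_id  ENCODING RLE,       -- RLE pays off on the leading sort column
    event_ts ENCODING DELTAVAL,
    payload
)
AS SELECT user_id, event_ts, payload
   FROM events
   ORDER BY user_id, event_ts
   SEGMENTED BY HASH(user_id) ALL NODES KSAFE 1;  -- K-safety 1 for HA
```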
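
The GoldenGate replication of STUDIO data could be sketched with parameter files roughly like these (process names, credentials, and paths are placeholders):

```conf
-- Extract (source side), e.g. dirprm/estudio.prm
EXTRACT estudio
USERID ggadmin, PASSWORD *****
EXTTRAIL ./dirdat/st
TABLE STUDIO.*;

-- Replicat (target side), e.g. dirprm/rstudio.prm
REPLICAT rstudio
USERID ggadmin, PASSWORD *****
ASSUMETARGETDEFS
MAP STUDIO.*, TARGET STUDIO.*;
```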

Confidential, Santa Monica, CA

Sr. Database Administrator/Data Engineer

Responsibilities:

  • Performed Data Engineer/RDBMS DBA role
  • Worked with multiple teams to establish a secure database environment
  • Worked with the data warehouse, business, and DBA teams to streamline data flow from real-time OLTP into a NoSQL environment. Used SharePlex change data capture (CDC), which accelerated data-warehouse report performance, removed complex code from the real-time OLTP database, simplified support, and improved performance of both source and target databases
  • Tuned the Big Data environment by introducing Impala, Hive external tables, and different compression methods
  • Re-architected the real-time RDBMS environment
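
An external Hive table of the kind mentioned above might be declared along these lines (the path and columns are hypothetical):

```sql
-- Impala can query the same table through the shared metastore
CREATE EXTERNAL TABLE clicks (
    user_id BIGINT,
    url     STRING,
    ts      TIMESTAMP
)
PARTITIONED BY (dt STRING)
STORED AS PARQUET                      -- columnar, compressed storage
LOCATION '/data/warehouse/clicks';
```

Being external, dropping the table leaves the underlying HDFS data in place.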

Confidential, Santa Monica, CA

SQL/NOSQL and Hadoop Administrator/System Architect

Responsibilities:

  • Maintained an 85-node Hadoop production cluster (capacity, tuning, backup, upgrades, cleanup, etc.)
  • Maintained a 12-node non-production cluster (capacity, tuning, backup, upgrades, cleanup, etc.)
  • Maintained Hadoop tools (Hive, Sqoop, Spark, ZooKeeper, YARN)
  • Performed RDBMS/NoSQL DBA, data engineer, data architect, and system architect roles
  • Helped the data warehousing team secure data, retain it according to SOX policy, implement the right data purges, and keep both production and non-production databases up and performing well 24/7
  • Worked with management, infrastructure, and DBA teams to build a new OLTP Oracle system supporting geographic distribution, replacing the existing Sybase system and its replication
  • Re-architected the data warehousing system at lower cost than the existing hardware, increasing system throughput more than 20-fold
  • Transferred 45% of the Oracle warehouse data to HDFS, giving data scientists the freedom to work with more data
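
Moving Oracle warehouse tables into HDFS with Sqoop (listed in the skills above) might look roughly like the command below; the connection string, credentials, and table names are placeholders:

```shell
# Import one Oracle fact table into HDFS with 8 parallel mappers
sqoop import \
  --connect jdbc:oracle:thin:@//ora-dw.example.com:1521/DWPRD \
  --username dw_reader -P \
  --table DW.FACT_ORDERS \
  --split-by ORDER_ID \
  --num-mappers 8 \
  --target-dir /data/dw/fact_orders
```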
