
HDP Admin Resume

Houston, TX

PROFESSIONAL SUMMARY:

  • Technically accomplished IT professional with over 18 years of experience in information technology and enterprise application development across multiple industries, including around 3 years of hands-on experience as an HDP (Hortonworks Data Platform) Administrator and over 9 years as an Oracle Database Administrator.
  • Hands-on experience in development and design frameworks for Hadoop ecosystems.
  • Responsible for implementation and ongoing administration of Hadoop infrastructure using Ambari.
  • Working with data delivery teams to setup new Hadoop users. This job includes setting up Linux users, setting up Kerberos principals and testing HDFS, Hive, Pig and MapReduce access for the new users.
  • Partner with client, development, support, and platform teams to deliver on-time results.
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
  • Screen Hadoop cluster job performances and capacity planning
  • Monitor Hadoop cluster connectivity and security
  • Manage and review Hadoop log files.
  • File system management and monitoring.
  • HDFS support and maintenance.
  • Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
  • Collaborating with application teams to install operating system and Hadoop updates, patches, version upgrades when required.
  • Point of Contact for Hadoop related escalations
  • Experience in leading development, support, and platform teams.
  • Experience in data mining, data modeling, EDW, and Oracle database administration.
  • Coordinating with the various global Support and Application Teams.
  • Interacting with project management and analysts in the scoping, analysis, design, planning, and governance of end-to-end (E2E) project solutions.
  • Familiar with data architecture activities such as data modeling and data ingestion pipeline design; experienced in optimizing ETL workflows.
  • Hands-on experience in data extraction, transformation, loading, and visualization using the Hortonworks platform: HDFS, Hive, Sqoop, HBase, Oozie, Beeline, YARN, Impala, Spark, Scala, Vertica, Oracle, and MS SQL Server.
  • Experience working with the MapR and Hortonworks distributions of Hadoop.
  • Involved in MapR-to-Hortonworks migration, including strategy planning and sizing preparation.
  • Expert in Vertica database optimization, including the design of storage projections and segmentation.
  • Knowledge of NoSQL databases such as HBase and MongoDB.
  • 9 years' experience as an Oracle Database Administrator covering installation, configuration, performance tuning, backup and recovery, and data optimization.
  • Establishing best practices for the organization.
  • Experience in Release Management and Planning Down time activities for major activities like software upgrades or firmware upgrades.
  • Strong working experience on Requirement analysis, System design, Storage estimation and functional Testing.
  • Experience as On-Call 24x7 production DBA support, application/development Engineer.
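As a concrete illustration, the Hadoop user-onboarding task described above (Linux account, Kerberos principal, HDFS and Hive access checks) typically reduces to a short command sequence. The user name, realm, host names, and paths below are hypothetical placeholders, not values from any actual engagement:

```shell
# Sketch of onboarding a new Hadoop user (user "jdoe" and realm EXAMPLE.COM are hypothetical).
useradd -m jdoe                                     # Linux account on the gateway/edge node
kadmin -q "addprinc jdoe@EXAMPLE.COM"               # create the Kerberos principal
sudo -u hdfs hdfs dfs -mkdir /user/jdoe             # HDFS home directory
sudo -u hdfs hdfs dfs -chown jdoe:jdoe /user/jdoe
kinit jdoe@EXAMPLE.COM                              # smoke-test access as the new user
hdfs dfs -ls /user/jdoe
hive -e 'show databases;'                           # verify Hive access
```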

TECHNICAL SKILLS:

Big Data Ecosystem: HDFS, Hive, Sqoop, Oozie, Falcon, Beeline, Spark, Kafka, HBase

SDLC Methodologies: Agile, Waterfall

Programming Languages: Shell Scripting, vsql, SQL and PL/SQL

NoSQL Databases: MongoDB, HBase

Operating Systems: Windows Family, Linux, HP-UX, AIX

Scripting Languages: Shell, SQL

Databases: Vertica, Oracle, MS - SQL Server, MS Access, DB2

Administration & Monitoring Tools: PuTTY, Nagios

Scheduling Tools: Tidal, Autosys

Other Tools: HPSM, ALM

PROFESSIONAL EXPERIENCE:

Confidential, Houston, TX

HDP Admin

Responsibilities:

  • Responsible for implementation and ongoing administration of Hadoop infrastructure.
  • Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
  • Working with data delivery teams to setup new Hadoop users. This job includes setting up Linux users, setting up Kerberos principals and testing HDFS, Hive, Pig and MapReduce access for the new users.
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
  • Screen Hadoop cluster job performances and capacity planning
  • Monitor Hadoop cluster connectivity and security
  • Refreshing data between clusters as needed
  • Manage and review Hadoop log files.
  • File system management and monitoring.
  • HDFS support and maintenance.
  • Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
  • Collaborating with application teams to install operating system and Hadoop updates, patches, version upgrades when required.
  • Point of Contact for Vendor escalation
  • Involved from the scoping and design phases through implementation
  • Supported Development team for designing and automating workflows
  • Configured a Vertica instance on Hadoop to perform data loads from Hive to Vertica DB
  • Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
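Routine log review of the kind listed above is usually scripted. The snippet below is a minimal, self-contained sketch that tallies WARN and ERROR lines; the sample log contents are fabricated purely for illustration, and a real check would point at the actual service log path instead:

```shell
# Hypothetical log-triage helper: count ERROR/WARN lines in a service log.
# The sample log below is fabricated for illustration only.
log=$(mktemp)
cat > "$log" <<'EOF'
2016-01-01 10:00:01 INFO  namenode started
2016-01-01 10:00:02 WARN  low disk space on /data
2016-01-01 10:00:03 ERROR block replication failed
2016-01-01 10:00:04 WARN  slow datanode heartbeat
EOF
errors=$(grep -c ' ERROR ' "$log")   # lines containing the ERROR level
warns=$(grep -c ' WARN ' "$log")     # lines containing the WARN level
echo "errors=$errors warns=$warns"   # -> errors=1 warns=2
rm -f "$log"
```

In practice a wrapper like this would be scheduled (e.g. via cron or Tidal) and wired to alerting when the error count is non-zero.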

Environment: Hortonworks, Ambari, Vertica DB

Confidential

Big Data Engineer

Responsibilities:

  • Imported the analyzed data into HDFS using Sqoop
  • Designed data processing layers on Vertica per business needs
  • Created the data model and schema for the Vertica DB data layer
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data, including joins and some pre-aggregations, before storing the data in HDFS.
  • Extracted the data from EDW into HDFS/Databases/Dashboards using Sqoop.
  • Involved from the requirements and design phases through implementation
  • Automated all jobs that pull data from relational databases into Hive tables using Oozie workflows, and enabled email alerts for any failure cases
  • Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
  • Set up Hadoop and Vertica clusters and managed the databases.
  • Managed and reviewed Hadoop log files
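An EDW-to-Hive pull of the kind described above might look like the following Sqoop invocation. The JDBC URL, credentials, and table names are placeholders, not details from the actual project:

```shell
# Hypothetical Sqoop import of an EDW table into Hive (all connection details are placeholders).
sqoop import \
  --connect jdbc:oracle:thin:@edw-host:1521/EDWDB \
  --username etl_user -P \
  --table SALES_FACT \
  --hive-import --hive-table staging.sales_fact \
  --num-mappers 4
```

A job like this would then be wrapped in an Oozie workflow (or a Tidal job) so it runs on schedule and raises an email alert on failure.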

Environment: MapR Hadoop, HDFS, Sqoop, Vertica, Oracle, MS SQL Server, TIDAL, batch scripting, vsql, Erwin

Confidential

Data Modeler and Migration Engineer

Responsibilities:

  • Data Modeling using Erwin
  • Space Management and growth estimates
  • Creation of database objects
  • Data Migration
  • Automating & scheduling TIDAL jobs
  • Performance tuning

Environment: Oracle, MS-Sql, TIDAL, Batch scripting, SQL

Confidential

Sr. Database Administrator

Responsibilities:

  • Experience as On-Call 24x7 production DBA support, application/development DBA.
  • Extensive knowledge on database administration for Oracle 8i, 9i, 10g and 11g, with experience in very large-scale database environment and mission critical large OLTP and OLAP systems.
  • Experience in managing Oracle databases running on HP-UX, MS-Windows, NT Server, UNIX and AIX.
  • Hands-on experience in Oracle 10g/11g RAC implementation and administration (Oracle Clusterware setup and configuration, RAC installation using ASM, Grid installation).
  • Good experience in RAC Clusterware administration, configuration, and patching. Extensive experience in Data Guard configuration and in implementing and maintaining standby databases. Hands-on experience with Grid Control, Streams, and GoldenGate replication.
  • Experience in performance tuning using EXPLAIN PLAN, SQL Trace, TKPROF, STATSPACK, AWR, and ADDM.
  • Setup Oracle Enterprise Manager (OEM) 10g & Grid Control (11g) for database monitoring/performance/diagnostics.
  • Extensive experience in database backups with RMAN (full/incremental backups) and in traditional hot/cold backups.
  • Experience in logical backups and database migration tools: Data Pump, Export/Import, and Transportable Tablespaces.
  • Good experience in writing and editing UNIX shell scripts, and in patching and upgrades.
  • Expertise in upgrading Oracle Development and Production databases from 8i to 9i, 9i to 10g & 11g, 10g to 11g.
  • Extensive knowledge and experience on managing very large Data Warehouse databases.
  • Proficient in SQL, PL/SQL, Stored Procedures, Functions, Cursors and Triggers. Extensive experience in configuring and implementing ASM and proficiency in using ASMCMD and ASMLIB.
  • Experience in Flash-Back Technology.
  • Experience in Database Refresh and patch management and in Database cloning with RMAN and manual methods.
  • Experience in Capacity Planning and Bench Marking. Developed database monitoring/health check alert scripts for database uptime/downtime status, and sizing issues using grid control (OEM).
  • Proficient in raising TARs with Oracle Support and using MetaLink to resolve bottlenecks in database environments.
  • Excellent communication and coordination skills, working with system administrators, business data analysts, and development teams.
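The RMAN backup routine mentioned above can be sketched as a small driver script. The exact backup level, retention policy, and options would depend on site standards, so treat this as a hedged example rather than the actual production script:

```shell
# Sketch of an RMAN incremental level-0 backup run (database connection and policy are hypothetical).
rman target / <<'EOF'
RUN {
  BACKUP INCREMENTAL LEVEL 0 DATABASE PLUS ARCHIVELOG;
  DELETE NOPROMPT OBSOLETE;
}
EOF
```

A level-1 (incremental) variant of the same script would typically run nightly between weekly level-0 backups.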

Environment: Oracle, OEM, HP-UX, Windows NT

Confidential

Data Migration Engineer

Responsibilities:

  • Oracle 10g server installation and database setup
  • Security profile management
  • Data modeling and Schema creation
  • Successfully migrated schemas from IBM OS/390 DB2 7.1 to Oracle 10g on HP-UX and mapped data types
  • Automated Data Migration
  • Application and database tuning
  • Involved in real time replication from DB2 to Oracle and Oracle to DB2

Environment: Oracle, HP-UX, IBM OS/390 DB2 7.1, SQLWays, Windows

Confidential

Database Administrator

Responsibilities:

  • Managing the database's storage structures
  • Performing capacity planning
  • Managing users and security
  • Migrating schema objects such as tables, indexes, and views
  • Making database backups and performing recovery when necessary
  • Proactively monitoring the database's health and taking preventive or corrective action as required
  • Monitoring and tuning performance
  • Troubleshooting problems with databases, applications, and development tools.
  • Alert log monitoring and troubleshooting
  • Monitor application related jobs and data replication activities
  • Implement and maintain database security

Environment: UNIX, Oracle 8i, Autosys

Confidential

Application Developer

Responsibilities:

  • Built PL/SQL code base
  • Created Forms and reports using Oracle D2K
  • Migrated D2K to Developer 5.0
  • End-user training and support during new financial year changes

Environment: AIX, Windows NT, Oracle 7.1, D2k, Developer 5
