Hadoop Administrator Resume
Denver, Colorado
SUMMARY
- Over 8 years of Information Technology experience managing small to medium IT projects.
- Hands-on experience in database management, data management, deployments, release management, implementing high-availability and maximum-availability solutions, managing very large environments, application development, and application management.
- Experience in the architecture, design, and development of Big Data platforms, including large clusters, Hadoop ecosystem projects, and custom MapReduce jobs in Java; also user management, cluster management, NoSQL database setup, and security design and implementation.
- Automation experience using VB and macros. Championed several initiatives within CVS Caremark, most notably being responsible and accountable for delivering a utility that saves CVS $35 million annually and helped avoid a sanction that could have resulted in losing the entire MedD business, worth $500 million.
- Hands-on experience in mainframe development and manual testing.
TECHNICAL SKILLS
Database: NoSQL, Oracle, DB2, IMS, MongoDB, HBase
Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, MongoDB, Zookeeper, Hive, Pig, ZKFC
UNIX Tools: Apache, Yum, RPM
Languages: Chef cookbooks, shell scripts, Pig basics, COBOL, CICS, VSAM, DB2, IMS, JCL, REXX, MVS
Configuration management tools: Chef, Puppet
Apache: Hadoop MapReduce, YARN, Pig, Hive, Sqoop, Flume, Oozie, Zookeeper, HBase, MongoDB, Hortonworks and Cloudera distributions, Ganglia for server management, and Kerberos for Hadoop security
Healthcare tools: Facets, QNXT, Fazal
Mainframe: z/OS, Endevor, Xpediter, IBM Debugger, IBM FILEAID, ChangeMan, MVS, Trace Master, INSYNC, METAMON
PROFESSIONAL EXPERIENCE
Confidential, Denver, Colorado
Hadoop Administrator
Responsibilities:
- Installed, configured, and maintained single-node and multi-node Hadoop clusters.
- Interacted with the Windows server management team to set up multiple virtual Windows application servers on a single physical box.
- Set up cluster environments for highly available systems.
- Tested failover to secondary and failback to primary on VM clusters for application and SQL VMs.
- Tested the disaster recovery system by routing traffic to an alternate disaster recovery server with the help of the LAN team.
- Installed Apache Hadoop 2.5.2 and Apache Hadoop 2.3.0 on Linux dev servers.
- Upgraded Apache Hadoop from version 2.3.0 to 2.5.2 on Linux servers.
- Implemented a high-availability system for the Hadoop NameNode.
- Installed and configured Kafka 2.6.
- Installed Pig and Hive on multi-node clusters.
- Configured Sqoop to import data from external databases (SQL Server and MySQL).
- Configured Hadoop users for HDFS and MapReduce.
- Set up Hive and NoSQL databases with a remote metastore.
- Integrated Pig, Hive, and Sqoop on Hadoop.
- Performed monthly Linux server maintenance: shutting down essential Hadoop daemons (NameNode, DataNode, JobTracker, and TaskTracker) and restarting Hadoop services, including YARN.
- Planned and maintained the cluster architecture.
- Kerberized the cluster to implement security.
- Assisted with performance tuning and monitoring.
- Created HBase tables to load large sets of structured, semi-structured, and unstructured data coming from UNIX, NoSQL, and a variety of portfolios.
- Supported code/design analysis, strategy development, and project planning.
- Assisted with data capacity planning and node forecasting.
- Collaborated with the infrastructure, network, database, application, and BI teams to ensure data quality and availability.
- Worked with users to resolve issues related to access and jobs running on the cluster.
- Worked with vendors to resolve product-related issues.
- Security design and implementation.
- User management, quota management, and cluster management.
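A Sqoop import like the one described above might look as follows. This is a minimal sketch only: the database URI, credentials, table name, and HDFS path are hypothetical placeholders, not actual project values. The script composes and prints the command rather than executing it, so it can be reviewed before running on a real cluster.

```shell
#!/bin/sh
# Compose an illustrative Sqoop import command from a MySQL source into HDFS.
# All connection details below are hypothetical placeholders.
DB_URI="jdbc:mysql://dbhost.example.com:3306/salesdb"   # hypothetical source DB
TABLE="orders"                                          # hypothetical table
TARGET="/data/raw/${TABLE}"                             # HDFS landing directory

# Print the command for review; on a real cluster, run it directly instead.
echo "sqoop import \
  --connect ${DB_URI} \
  --username hadoop_etl \
  --password-file /user/hadoop/.sqoop.pwd \
  --table ${TABLE} \
  --target-dir ${TARGET} \
  --num-mappers 4"
```

Using `--password-file` keeps credentials out of the process list, and `--num-mappers` controls how many parallel map tasks split the import.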
Confidential
Hadoop Administrator
Responsibilities:
- Planned, designed, and implemented the cluster architecture.
- Installed and configured MapReduce, Hive, and HDFS; implemented a CDH4 Hadoop cluster on CentOS. Assisted with performance tuning and monitoring.
- Created HBase tables to load large sets of structured, semi-structured, and unstructured data coming from UNIX, NoSQL, and a variety of portfolios.
- Created MongoDB NoSQL databases for a variety of portfolios.
- Supported code/design analysis, strategy development, and project planning.
- Assisted with data capacity planning and node forecasting.
- Collaborated with the infrastructure, network, database, application, and BI teams to ensure data quality and availability.
- Administered Pig, Hive, and HBase, installing updates, patches, and upgrades.
- Performed cluster installation, configuration, and administration for lab, staging, and production environments.
- Installed, configured, and used ecosystem tools (Hive, Pig, Mahout).
- Applied patches and built ecosystem tools.
- Set up NoSQL databases (HBase, MongoDB).
- Worked with users to resolve issues.
- Worked with vendors to resolve product-related issues.
- Security design and implementation.
- User management, quota management, and cluster management.
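Creating an HBase table such as those mentioned above is typically done through the HBase shell. The sketch below writes an illustrative DDL statement to a file; the table name, column family, and TTL are hypothetical, not values from the actual project.

```shell
#!/bin/sh
# Illustrative HBase table DDL (table/column-family names are hypothetical).
# Written to a file so it could be reviewed, then fed to the HBase shell.
DDL_FILE="create_raw_events.hbase"
cat > "$DDL_FILE" <<'EOF'
create 'raw_events', {NAME => 'cf', VERSIONS => 1, TTL => 7776000}
EOF
# On a real cluster: hbase shell -n "$DDL_FILE"
cat "$DDL_FILE"
```

`VERSIONS => 1` keeps only the latest cell value, and `TTL => 7776000` expires data after 90 days; both are per-column-family settings.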
Confidential
Team Lead
Responsibilities:
- Planned, implemented, and managed multiple parallel projects involving multiple technology and domain platforms (mainframe and Java applications).
- Handled daily activities of several Wipro consultants, CVS FTEs, and several vendor consultants.
- Maintained and worked with offshore teams of 15 people over a one-year period.
- Involved in the migration to FACETS and its implementation; involved in end-to-end testing of the FACETS Billing, Enrollment, Claims Processing, and Subscriber/Member modules.
- Used FACETS Analytics for fast and easy retrieval, display, and grouping of information when performing queries and generating reports.
- Configured FACETS to adhere to the customer's workflow for claims processing, claims automation, and group administration, as well as claims pricing and adjudication related to specific edits and benefits. Hands-on experience with different types of claims: inpatient, outpatient, and professional.
- Configured member- and group-level benefits and contract configuration in FACETS.
- Worked on the Member Management, Eligibility, Claims, and Billing modules within FACETS.
- Managed care, provider and member enrollment, PBM, FACETS, and eligibility updates per the PPACA (Patient Protection and Affordable Care Act).
- Ensured consistent and integrated implementation of service initiatives across customer segments and business units. Delivered the total business solution on time and within budget.
- Conducted weekly risk, issue, and status reporting with customers and IT team leads. Fully accountable for complex/diverse projects with a high degree of business risk.
- Requirements gathering with clients; attended requirements kick-off meetings.
- Estimation using the SMC technique (top-down and bottom-up approaches) and function point estimation.
- Strategized to create savings opportunities and reposition health care for the future of our members.
- Project planning using MPP and capturing quality metrics such as QC for all projects.
- Quality management plan, business impact analysis, business risk analysis, and BMP planning.
- Created various VBA Excel automation tools to calculate quality metrics and project health.
- Understood requirements and functional specs to come up with high-level designs; also estimated requirements and assessed impact on the existing system.
- Part of the team that helped successfully migrate a mainframe-based system (MBEST) to FACETS at CVS.
- Crosswalked all relevant information from MBEST to FACETS during migration and helped migrate 4 million MBEST members to FACETS.
- Helped create the pre-enrollment system Fazal, which configures input data to be compatible with the FACETS enrollment system.
- Worked on the Member Management, Enrollment, Eligibility, Claims, and Billing modules within FACETS.
- Led a team of 10 IT engineers to provide quality assurance.
- Helped bring the system into compliance with MedD rules.
- Interacted with the business to strategize implementation and finalize requirements.
Prior PBM
Confidential
Responsibilities:
- Displayed on screen only authorized prescribers with complete information in our system, so the order process completes successfully without any conflicts.
- Worked with the Java team to coordinate back-end (mainframe) and front-end (Java) processing.
- Carefully managed client-sensitive data and followed appropriate account reconciliation with multiple teams and vendors.
- Implemented a non-payment queue using an MQ call from one mainframe system to another.
- A high-risk project executed within stringent timelines.
- Planned and executed various automation and small modification projects of up to 500 man-hours.
- Managed pending transactions (delete, view, and update).
- Added new CICS screens to view and update pending transactions. Another part of the release included cancellation of disenrollment and canceling the cancellation-of-disenrollment request on the same day it was submitted.
- Accountable for effective risk and issue communication and mitigation.
- Added and updated new contract IDs and premiums through a CICS screen meant for Caremark production support.
- First project at Headstrong, with high visibility up to top management, involving the client and business analysts.
Business Domain
Confidential
Responsibilities:
- Worked on a mainframe-based system to process payroll for different clients. It was a conversion project in which the system, earlier based on the DATACOM database, was converted to DB2. It was divided into 3 releases and lasted more than a year and a half.
- Modified and enhanced COBOL programs to access DB2 instead of the DATACOM database.
- Mainly part of the team working on the batch portion, which had more than 100 complex programs to be converted to DB2 access.
- Also helped other teams convert more than 1,000 CICS modules to DB2 access in less than a year and a half.
- Took part in defect prevention techniques.
- Implemented new techniques in a COBOL-DB2 program that significantly reduced total DB2 usage time.
- Developed several REXX tools to reduce the effort.
- Worked on a team of more than 30 people and still managed to earn a client distinction for my work, along with several recognitions.
- Responsibilities included code reviews, mentoring a 6-member team, imparting domain and technical training to new team members, and defect prevention activities.
- Understood requirements and functional specs to come up with high-level designs; also estimated requirements and assessed impact on the existing system.
- Dedicated accumulation fields exist on the Employee Master file (EMP) for most values required for accurate calculations and/or quarterly or annual tax reports (gross, taxes, special compensation, etc.). Additionally, special accumulator fields are available to store various values. Before this project they were limited in number; after this project, storage of special instructions and tax options was made virtually unlimited.
- Modified and enhanced COBOL programs as per the project.