Big Data DevOps/Hadoop Administrator Resume
San Francisco, CA
SUMMARY
- 12+ years of IT experience in requirement analysis, design, development, implementation, support, and coordination of web-based applications
- 3+ years of solid Hadoop administration experience building, operationalizing, and managing small to medium Hadoop clusters using distributions such as CDH 5.x/4.x, HDP 2.x, Pivotal HD 2.x/1.x, and AWS
- 3+ years of Oracle SOA 11g/10g development and administration experience in SOA, BPEL, OSB, and BAM
- 6+ years of expertise in Java/J2EE technologies
- Expertise building Cloudera and Hortonworks Hadoop clusters on Amazon EC2
- Experience building Apache Hadoop clusters on Google Cloud
- Experience in capacity planning, validating hardware and software requirements, building and configuring small and medium-sized clusters, smoke testing, and managing and performance-tuning Hadoop clusters
- Expertise in Rolling and Express upgrades for Ambari-managed HDP clusters
- Configured Rack Awareness on HDP clusters
- Expertise in using the DistCp utility to transfer terabytes of data between secured and unsecured clusters (see the sketch after this summary)
- Implemented Apache Ranger and used Sentry, centralized authorization frameworks for security, auditing, and management of Hadoop clusters
- Enabled HDFS, Hive, and HBase plugins in Ranger and defined policies for core services as part of Ranger policy management
- Expertise in implementing Kerberos security on Hadoop clusters (see the principal/keytab sketch after this summary)
- Worked on a POC streaming and analyzing Twitter data into a Hadoop cluster using Apache Flume
- Expertise using Apache Spark (a fast engine for large-scale data processing) and Shark (fast Hive SQL on Spark)
- Worked on a POC evaluating the performance of Apache Spark/Shark against Apache Hive
- Expertise using Talend, Pentaho, and Sqoop for ETL operations
- Expertise working on use cases involving data ingestion, data streaming, and sensor data over large data sets
- Expertise in using Sqoop to import and export data between external systems and HDFS
- Experience using the MapR Control System, a platform that provides a dashboard view of cluster health at a glance
- Expertise in developing and deploying OSB services and BPEL processes using technology adapters (DB Adapter, File Adapter, JMS Adapter), fault handling (catch and catchAll), messaging and alerts, transformations, and sensors
- Expertise in the use of WLST scripting to compile, deploy, and list SOA composites and OSB services
- Expertise in taking offline and online backups of SOA runtime, application, metadata, and config data
- Expertise in SOA performance tuning through purging strategies
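A minimal sketch of the secured-to-unsecured DistCp transfer referenced above; host names, ports, and paths are hypothetical:

    # Run from the secured side with a valid Kerberos ticket. webhdfs:// is
    # commonly used toward the unsecured cluster; -skipcrccheck is needed
    # (with -update) because HDFS and WebHDFS checksums differ.
    kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs
    hadoop distcp -update -skipcrccheck \
      hdfs://secure-nn.example.com:8020/data/landing \
      webhdfs://unsecure-nn.example.com:50070/data/landing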
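For the Kerberos work, a sketch of the per-service principal and keytab setup involved; the realm, host, and keytab paths are hypothetical:

    # Create a DataNode service principal and export its keytab (MIT kadmin).
    kadmin -p admin/admin -q "addprinc -randkey dn/worker01.example.com@EXAMPLE.COM"
    kadmin -p admin/admin -q "ktadd -k /etc/security/keytabs/dn.service.keytab dn/worker01.example.com@EXAMPLE.COM"
    # Verify the keytab before wiring it into the Hadoop service configs.
    klist -kt /etc/security/keytabs/dn.service.keytab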
TECHNICAL SKILLS
Hadoop distributions: Cloudera (CDH), Hortonworks (HDP), Pivotal HD (PHD)
SOA: Oracle SOA Suite 11g/10g, AIA 2.5, O2B PIP, Oracle BPEL, ESB, OWSM, EDN, Business Rules, ActiveVOS, Mediator, JDeveloper, EM, BAM, OSB 11g
Servers: WebLogic 8.1/9.2/10.3, Tomcat 4
IDEs: Eclipse, JDeveloper
Messaging Middleware: JMS (WebLogic)
Databases: Oracle XE, Oracle Database 10g, DB2
Other Tools & Versioning Systems: TOAD, CVS, VSS, ClearCase
Operating Systems: UNIX, Linux, Ubuntu, Windows NT 4.0, Windows 2000/XP
PROFESSIONAL EXPERIENCE
Confidential - San Francisco, CA
Big Data DevOps/Hadoop Administrator
Responsibilities:
- Expertise in capacity planning, configuration, operationalization, and management of small to medium-sized Big Data Hadoop clusters
- Expertise building a 1 PB 80-node and a 175 TB 20-node HDP Hadoop cluster using Ambari
- Expertise supporting and administering a ~450-node BDCOE HDP Hadoop cluster using Ambari
- Expertise defining hardware and software prerequisites for small to medium Hadoop clusters
- Experience performance-tuning the Hadoop cluster components: HDFS, MapReduce2, YARN, Hive, HBase, and ZooKeeper
- Cluster upgrade expertise (HDP 2.2.4.2 to HDP 2.3, Ambari 2.0 to Ambari 2.2.1.x)
- Extensive bash scripting experience backing up and monitoring the Postgres (Ambari) and MySQL (Hive, Oozie, Ranger) databases (see the backup sketch after this list)
- Experience upgrading clusters using Rolling Upgrade and Express Upgrade
- Expertise in the usage of Kafka, Spark, and Storm for BDCOE Adworks data sets
- Monitored Hadoop cluster job performance and performed capacity planning
- Experience working with the business team to understand and implement various Big Data Hadoop use cases and POCs
- Expertise diagnosing, troubleshooting, and fixing Hadoop infrastructure issues
- Configured Rack Awareness on HDP clusters (see the topology-script sketch after this list)
- Expertise in using DistCp to transfer terabytes of data between secured and unsecured clusters
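A minimal sketch of the backup scripting described above; hosts, credentials, and paths are hypothetical, and the real scripts also handled monitoring and retention:

    #!/usr/bin/env bash
    # Nightly dumps of the cluster metadata databases.
    set -euo pipefail
    STAMP=$(date +%Y%m%d)
    BACKUP_DIR=/backups/${STAMP}
    mkdir -p "${BACKUP_DIR}"
    # Ambari stores its state in Postgres.
    pg_dump -U ambari -h ambari-db.example.com ambari | gzip > "${BACKUP_DIR}/ambari.sql.gz"
    # The Hive metastore, Oozie, and Ranger use MySQL.
    for db in hive oozie ranger; do
      mysqldump -u backup -p"${MYSQL_PW}" -h mysql-db.example.com "${db}" \
        | gzip > "${BACKUP_DIR}/${db}.sql.gz"
    done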
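Rack awareness on HDP is wired in through the net.topology.script.file.name property pointing at a script along these lines; the IP-to-rack mapping shown is hypothetical and site-specific:

    #!/usr/bin/env bash
    # Invoked by the NameNode with one or more DataNode IPs/hostnames;
    # must print one rack path per argument.
    for host in "$@"; do
      ip=$(getent hosts "$host" | awk '{print $1}')
      rack=$(echo "${ip:-0.0.0.0}" | cut -d. -f3)
      echo "/dc1/rack-${rack}"
    done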
Confidential - San Ramon, CA
Big Data Hadoop Admin Consultant
Responsibilities:
- Installation, configuration, and management of a Greenplum Pivotal Hadoop cluster
- Involved in user creation and management on the IBD platform for various Confidential business units
- Used Pivotal Hadoop for data ingestion into HAWQ using Sqoop, Hive, Hive2, PXF, and GemFire XD
- Expertise in validating Hive, HBase, Sqoop, Flume, and Oozie jobs
- Expertise in the usage of Kerberos security deployed on the Hadoop cluster
- Enabled NameNode and Pivotal cluster high availability in IBD environments (see the status-check sketch after this list)
- Expertise using ETL systems as staging systems for user management, data loading, data analytics, and Hadoop cluster access for end users
- Enforced web user authentication for the NameNode and Oozie web consoles
- Coordinated with the operational Hadoop support team
- Managed and reviewed Hadoop log files
- Used the GitLab UI with Puppet as the platform to manage Hadoop users
- Defined and implemented the log retention period for Hadoop components
- Attended hands-on Greenplum database training
- Expertise in the usage of Pivotal HAWQ, a parallel SQL processing engine that renders Hadoop queries faster than other Hadoop-based query interfaces
- Addressed performance tuning of Hadoop ETL processes against very large data sets, working directly with statisticians on implementing solutions involving predictive analytics
- Developed Hadoop monitoring processes (capacity, performance, consistency) to ensure processing issues are identified and resolved swiftly
- Instrumental in setting up the GemFire XD cluster environment in distributed mode
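For the NameNode HA work noted above, the post-enablement sanity check looks like the following sketch; nn1/nn2 are the usual default service IDs but are cluster-specific:

    # One NameNode should report "active" and the other "standby".
    hdfs haadmin -getServiceState nn1
    hdfs haadmin -getServiceState nn2
    # The ZKFC process should be running on both NameNode hosts.
    jps | grep DFSZKFailoverController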
Confidential - San Francisco, CA
Hadoop Consultant
Responsibilities:
- Gathered requirements from management and from each module's stakeholders, and analyzed them to develop the initial high-level design
- Installed the Hadoop cluster
- Wrote MapReduce applications
- Loaded data from Teradata tables into Hive tables (see the Sqoop sketch after this list)
- Tested the performance of Hadoop/Hive against Teradata
- Created table schemas in Hive
- Participated in client calls for design, code, and test case walkthroughs
- Designed and built robust Hadoop solutions for Big Data problems
- Deep understanding of and hands-on experience with Hadoop, HBase, Hive, Pig, and MapReduce
- Implemented ETL processes with Hadoop, MapReduce, Java, Pig, and Hive
- Worked with application teams to install operating system and Hadoop updates and version upgrades as required
- Coordinated with the operational Hadoop support team
- Managed and reviewed Hadoop log files
- Addressed performance tuning of Hadoop ETL processes against very large data sets, working directly with statisticians on implementing solutions involving predictive analytics
- Developed Hadoop monitoring processes (capacity, performance, consistency) to ensure processing issues are identified and resolved swiftly
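A minimal sketch of the Teradata-to-Hive load described above; the JDBC URL, table names, and credentials are hypothetical:

    # Import a Teradata table into Hive via Sqoop using Teradata's JDBC driver.
    sqoop import \
      --connect jdbc:teradata://td-prod.example.com/DATABASE=sales \
      --driver com.teradata.jdbc.TeraDriver \
      --username etl_user -P \
      --table ORDERS \
      --split-by ORDER_ID \
      --num-mappers 8 \
      --hive-import --hive-table staging.orders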
Confidential - Houston, Texas
Java SOA Consultant
Responsibilities:
- Involved in using the OSB 11g, SOA 11g, and BAM 11g components of Oracle SOA Suite 11g to create SOA web services for Pyxis Mobile
- Involved in using SOA 11g to build business logic, with DB, File, FTP, and JMS adapters for invoking DB stored procedures in the IMS back-end application, reading and writing files, inserting images into the IMS back end via FTP, and auditing messages between OSB and SOA via JMS
- Involved in defining service virtualization for proxy services in OSB
- Involved in using direct binding for OSB-to-SOA communication
- Involved in defining error handling in OSB and SOA by defining OSB- and SOA-specific error codes
- Involved in defining auditing in OSB and SOA
- Involved in using the OSB test console to test proxy message flows and business services
- Extensively used SOAP UI to validate SOA services
- Defined and executed configuration plans for environment-specific properties
- Expertise in using WLST (WebLogic Scripting Tool) and Ant scripts to automate SOA composite deployment (see the deployment sketch after this list)
- Involved in designing and architecting SOA disaster-recovery steps
- Expertise working in a clustered SOA environment
- Expertise in developing and deploying BPEL processes using technology adapters (DB Adapter, File Adapter, JMS Adapter), fault handling (catch and catchAll), messaging and alerts, transformations, and sensors
- Expertise in taking offline and online backups of SOA runtime, application, metadata, and config data
- Expertise in SOA performance tuning through purging strategies
- Expertise in designing high-availability architecture for SOA and web tier components in a clustered environment
- Expertise in File, FTP, JMS, and DB adapter configuration in WebLogic Server administration
- Extensively used XSLT transformations in SOA and XQuery transformations in OSB
- Involved in creating service accounts for proxy and business service authentication in the OSB 11g project
- Expertise in the use of WLST scripting to compile, deploy, list, and administer composites
- Expertise in the use of Node Manager to manage SOA admin and managed servers
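A minimal sketch of the scripted composite packaging and deployment mentioned above, using the ant-sca-package.xml and ant-sca-deploy.xml utilities that ship with SOA Suite 11g; the server URL, composite name, and paths are hypothetical:

    # Package the composite into a SAR, then deploy it to the SOA server.
    ant -f $ORACLE_HOME/bin/ant-sca-package.xml package \
      -DcompositeDir=/src/OrderComposite \
      -DcompositeName=OrderComposite -Drevision=1.0
    ant -f $ORACLE_HOME/bin/ant-sca-deploy.xml deploy \
      -DserverURL=http://soahost.example.com:8001 \
      -DsarLocation=/src/OrderComposite/deploy/sca_OrderComposite_rev1.0.jar \
      -Duser=weblogic -Doverwrite=true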
Environment: Oracle SOA Suite 11g, WebLogic Application Server 10g, SOAP UI, Oracle, UNIX, EM, BAM 11g, JDeveloper 11g, OSB 11g, TOAD
Confidential, Sprint, VA
Java SOA & AIA PIP Consultant
Responsibilities:
- Involved in creating delta tech designs based on the functional designs
- Involved in the usage of the AIA SalesOrderEBO
- Involved in the usage of SalesOrderEBM, ProcessSalesOrderFulfillmentEBM, and CreateShipmentRequestResponseEBM
- Involved in data mapping between the Siebel payload and ProcessSalesOrderFulfillmentEBM
- Involved in analyzing and data mapping between the Siebel (source), SOA, and NOE (target) systems
- Involved in designing and building the Contract Activation and DeviceUpdate flows
- Involved in integrating the Submit Order interfaces with other interfaces
- Involved in the usage of DVM lookups for referencing Domain Value Maps
- Involved in configuring producer and consumer queues in OC4J and WebLogic Server
- Involved in creating JMS services in BPEL processes for writing and listening to messages in the queues
- Involved in using the HermesJMS tool to test the JMS queues
- Involved in end-to-end testing to validate order progression from source to target systems
- Involved in using OWSM to create users and groups and to attach/detach policies in the EM console for composite services using username-token and SAML-token mechanisms, making the SOA-layered composite application more secure and less complex
- Tested the developed BPEL processes using web service client tools and SOAP UI
- Developed custom XPath functions to transform messages
- Implemented exception handling for the BPEL process models to handle system and business exceptions, and used compensation handlers to handle transactions when an exception occurs
- Involved in creating proxy and business services using OSB to build the SOA 11g interface
- Involved in using MFL (Message Format Language) with proxy and business services for file transfer
- Involved in creating service accounts for proxy and business service authentication in the OSB 11g project
- Involved in designing and architecting SOA disaster-recovery steps
- Expertise working in a clustered SOA environment
- Expertise in developing and deploying BPEL processes using technology adapters (DB Adapter, File Adapter, JMS Adapter), fault handling (catch and catchAll), messaging and alerts, transformations, and sensors
- Expertise in taking offline and online backups of SOA runtime, application, metadata, and config data
- Expertise in SOA performance tuning through purging strategies
- Expertise in designing high-availability architecture for SOA and web tier components in a clustered environment
- Expertise in the use of WLST scripting to compile, deploy, and list composites
- Expertise in the use of Node Manager to manage SOA admin and managed servers
Environment: AIA 2.5, O2B PIP, Oracle SOA Suite 11g/10g, WebLogic Application Server 10g, SOAP UI, Oracle, UNIX, Mediator, EM, BAM, Business Rules, EDN, JDeveloper 10g/11g, OSB 11g
Confidential, Atlanta
Java Consultant
Responsibilities:
- Responsible for Requirements gathering, Functional specs, Use case specs and System design.
- Developed the application using the Spring MVC framework; involved in creating and updating the servlet.xml and application-context files, and writing handler mappings, interceptors, controllers and handlers, view resolvers, etc.
- Installed and configured the high-availability setup for Oracle ESB: set up Oracle application servers and Apache web servers in a cluster, installed the runtime and repository ESB components, and configured the web servers with a load balancer; the high-availability implementation was done on Linux servers
- Led a team of 4 on an offshore/onsite project: there were 4 people at offshore, and I led the team at onsite, guiding the team through changes and involving them in meetings and discussions with the client
- Set up the build process with the Ant framework to build and deploy the application; used shell and Perl scripts for various batch jobs
- Created and deployed EJBs to process business logic and interfaced them with servlets
- Involved in an Ajax implementation for simple page updates of this application instead of complete page refreshes
- Used the ActiveVOS tool from Active Endpoints for BPEL workflows
- Involved in using the ActiveVOS tool for creating, managing, deploying, auditing, and monitoring processes
- Used ActiveVOS Designer as the development environment
- Tested the developed BPEL processes using web service client tools and SOAP UI
- Involved in production support working on Telcordia's Exact and IDIS systems
- Developed custom XPath functions to transform messages
- Involved in developing BAM dashboards to capture metrics across various applications
- Implemented exception handling for the BPEL process models to handle system and business exceptions, and used compensation handlers to handle transactions when an exception occurs
Environment: Java, Spring MVC, JSP, Servlets, JMS, Spring, Hibernate, SQL, LDAP, Oracle, WebSphere 6, Eclipse, Ant, CVS, Windows NT, and UNIX
Confidential
Oracle SOA Engineer
Responsibilities:
- Developed SOAP-based web services using WSDL, SOAP, Axis, and Oracle JDeveloper
- Developed custom XPath functions to transform messages
- Used the JDeveloper IDE to code BPEL process models
- Implemented exception handling for the BPEL process models to handle system and business exceptions, and used compensation handlers to handle transactions when an exception occurs
- Analyzed the business needs of the processes and modules that needed migration
- Designed and implemented adapters for backward compatibility with old modules
- Implemented old processes in new modules, ensuring all business needs were addressed
- Designed, developed, and deployed web services using Oracle BPEL, Java, and XML (WSDL)
- Assisted in training others on BPEL and determining how to integrate BPEL into the Common Services projects
- Extensively used BPEL for dynamic partner link calls, updating databases via BPEL, and generating proxy stub classes to call BPEL processes from a web application
Environment: Oracle SOA Suite (BPEL, BAM, ESB, Web Services Manager), Informatica ActiveVOS, Oracle Application Server, JDeveloper, Linux, Siebel Adapter, ClearCase
Confidential
TouchPoint Developer
Responsibilities:
- Involved in the full SDLC
- Involved in defect fixes for the TouchPoint application
- Involved in SRD (Software Requirement Specification) reviews
- Involved in baselining (final draft) the SRD
- Involved in preparing and executing UT (unit test) and IT (integration test) plans
- Involved in executing the unit tests
- Involved in implementing the development by identifying use cases from the SRD, matching them with prototypes from the design document, and finally identifying the fields and parameters from the API
Environment: TouchPoint 14.0, Apache Tomcat 6.0.18, Apache Ant 1.7, Eclipse 3.2, JSP, JSF, JavaScript, Struts 1.0, Hibernate, XML, XSLT, AJAX, MySQL 5.0, Windows XP, SVN, TortoiseSVN