Hadoop Technical Specialist Resume
SUMMARY
- Over 20 years of experience working with Oracle, SQL Server, Cassandra, Hive (HQL), and Teradata in OLTP and Data Warehousing environments, including the Hadoop EDL.
- Experience in the Financial, Retail Technology, Manufacturing, and Service industries, including major banks such as US Bank, Wells Fargo, and Scotiabank.
- Experience managing all incoming data files for tenants and individual properties.
- Review data for inconsistencies or anomalies that could skew analytical results.
- Experience managing and tuning application servers. Proficient in Linux, specifically RHEL/CentOS.
- Experience maintaining databases and conducting routine maintenance as needed to ensure data integrity.
- Experience managing and tuning multiple databases, including replication site infrastructures.
- Provide individual properties with access to essential data sets.
- Streamline data collection and analysis procedures to ensure fast access to metrics.
- Make recommendations for software, hardware and data storage upgrades and capacity planning.
- Experience managing monitoring tools such as JConsole, Ganglia, Nagios, Dynatrace, Splunk, Streamfwrd, and JMeter.
- Cloud-native deployments of Cassandra on Kubernetes using StatefulSets.
- Experience working in SaaS and/or Managed Services data centers.
- Excellent understanding of relational database management concepts.
- Familiarity with high availability and load balancing architectures.
- Experience designing, creating, maintaining, and supporting database objects based on architectural designs.
- Experience with Cassandra OpsCenter and Teradata Viewpoint tools for monitoring sessions and workload management.
- Excellent negotiation skills to build consensus and achieve effective team collaboration.
- Experience with Agile, Scrum, Kanban, and Waterfall SDLC methodologies.
- Experience performing UAT, E2E, Exploratory, and Regression testing.
- Experience with CI tools such as Jenkins and Git, plus JIRA, TestRail, HTML, XML, and JSON objects.
- Experience with COBIT 5, SAS 70, and SOX regulations and audit operations.
- Experience with infrastructure automation tools like Chef, Ansible, and Docker, using DSL scripting and code refactoring.
- Experience with virtual environments: VMware, vCloud, Hyper-V, VirtualBox, Vagrant, and Azure cloud.
- Experience with orchestration tools like Kubernetes, Docker, and Vagrant for continuous deployment and delivery.
- Database space management, performance tuning and optimization in a proactive manner.
- Experience planning and performing database schema objects migrations and capacity planning.
- Design and implement backup & recovery policies with a Disaster Recovery Plan.
- Monitor production systems on a daily basis to catch problem queries and access violations.
- Experience with Big Data Cassandra and Hadoop ecosystem capabilities to increase operational efficiency, lower risk, detect fraud, and monitor cybersecurity in real time.
- Experience with the hdfs dfsadmin command, rack awareness, Ambari, and the YARN resource manager.
- Experience with the Diyotta Data Ingestion Framework and DataStage for data ingestion into the EDL.
- Experience with HDFS Rack awareness, HDFS Heterogeneous Storage (Disk/Archive Nodes), HDFS NFS Gateway
- Experience with Hadoop backup considerations and HDFS snapshots using the DistCp utility.
- Familiar with Attunity Replicate Data Ingestion and Integrating Framework.
- Familiar with the Global ID application for data profiling and data discovery.
- Quick learner, willing to adapt to new technologies, with excellent problem-solving skills.
- Experience supporting production databases in 24x7 environments.
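The HDFS snapshot and DistCp backup work mentioned above can be illustrated with a minimal shell sketch. All cluster names and paths here are hypothetical, chosen only to show the shape of the workflow:

```shell
# Hypothetical source path and NameNode hosts -- adjust for a real cluster.
# 1. Allow and create a read-only, point-in-time snapshot (HDFS admin rights).
hdfs dfsadmin -allowSnapshot /data/edl/landing
hdfs dfs -createSnapshot /data/edl/landing nightly

# 2. Copy the consistent snapshot to a backup cluster with DistCp, so
#    in-flight writes on the live directory cannot corrupt the copy.
hadoop distcp \
  hdfs://prod-nn:8020/data/edl/landing/.snapshot/nightly \
  hdfs://dr-nn:8020/backup/edl/landing

# 3. Sanity-check capacity and DataNode health after the transfer.
hdfs dfsadmin -report
```

Snapshotting first is what makes the DistCp copy a stable backup; copying the live directory directly can miss or duplicate files that change mid-transfer.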
TECHNICAL SKILLS
Databases: Elasticsearch, Teradata V2R5/14/15, Oracle 10g/11g/12c, Cassandra DSE, MS SQL Server 2008/2012/2014
Operating Systems: SUSE Linux 11, RHEL 7, Ubuntu 12/14, MS Windows 2003/2008/2012/2014, Solaris, HP-UX, IBM AIX
Ecosystems: Kubernetes, Docker, DataStax Apache Cassandra, Hadoop and Spark, Teradata
Teradata Utilities: BTEQ, Teradata SQL Assistant, FastLoad, MLoad, FastExport, Tpump, TPT, Exp/Imp, Teradata Administrator, Teradata Manager, PMON, VIEWPOINT, TASM, DBQL,QCD, ARCMAIN, TARAGUI
Oracle Utilities: SQL*Plus, SQL*Loader, Oracle Data Pump, RMAN, RAC, Data Guard, OEM, ASM, AWR, and ADDM
Cassandra Utilities: OpsCenter, nodetool info, nodetool ring, nodetool cfstats, nodetool tpstats, nodetool netstats, nodetool drain, nodetool status
Hortonworks Hadoop DP: Ambari, Big SQL Access for Hadoop, Spark, Nagios, Ganglia, Dynatrace, Splunk, Streamfwrd, JMeter
Languages: Java SE, Python, SQL, Unix/Linux Shell Scripts, UML
Methodologies: Data Warehousing Design, Data Modeling, Logical and Physical Database Design, Star Schema Snowflakes Modeling
Data Modeling: Conceptual Model, Application Model, Enterprise Model, Logical Model and Dimensional Model (Star Schema/Snowflake) for Physical Database Design.
SDLC & DevOps Tools: Agile, Scrum/Kanban/Waterfall, Chef, Ansible, JIRA, HP ALM, TestRail, Jenkins, Git, NetBeans, Eclipse IDE
B/R Applications: TaraGUI, ARCmain, Avamar 6.0, 7.0, EVault 7.2, NetWorker, NetBackup, and Data Domain
Reporting Tools: Crystal Reports, Excel, SSRS, Cognos, Impromptu
Virtualization Tools: VMware Lab, VMware ESXi, Vagrant, VirtualBox, Hyper-V, Azure cloud, and VRA Lab
Network Protocols: NLB, LDAP, TCP, UDP, ICMP, NFS, iSCSI, CIFS, SMB, NAS, SAN, HDFS, GFS
Other Tools: WireShark, HeavyLoad, FIO, TCPDUMP, JMeter, Dynatrace, MS Visio, MS Office
PROFESSIONAL EXPERIENCE
Confidential
Hadoop Technical Specialist
Responsibilities:
- Day-to-day production support for the Hadoop EDL platform on the Hortonworks framework and the Diyotta data ingestion tool.
- Acts as Subject Matter Expert/Mentor and holds overall accountability for business requirement review, technical delivery and implementation of projects involving integration.
- Feed into technology infrastructure roadmaps for the HCOE (Hadoop Center of Excellence), providing services for EDL (data lake) ecosystems.
- Proactively partners with business line representatives to drive improved technology solutions that support business strategy.
- Collaborating with Product Owners, Software Developers, Scrum Masters and Project Managers to deliver solutions based on business requirements.
- Participating in meetings to gather and analyze requirements for new application features and data model design, and making recommendations from a Cassandra database standpoint.
- Cloud-native deployments of Cassandra on Kubernetes using StatefulSets.
- Work together with Solutions Architect to implement bank standards.
- Setting up and configuring DataStax Cassandra multi-datacenter cluster in Azure and VRA environments.
- Proactively design, cost, business case and strategize solutions to propose to the department and business line management.
- Managing, designing, and supporting storage management best practices, archives, restores, and disaster recovery.
- Tuned Cassandra multi-node cluster configuration parameters, improving transaction-per-second throughput by 85%.
- Participate in Architecture forums, Research and recommend enhancements to the strategic technology evolution of the organization based on new and emerging technologies
- Monitoring and optimizing the Cassandra database using OpsCenter, nodetool, JConsole, and CQL queries.
- Participates as technical or business consultant in design, development, coding, testing, and debugging new packaged solutions or significant enhancements to existing applications.
- Working with open source ecosystem products such as Hadoop, Docker, Kubernetes, Cassandra, Grafana, and Kibana.
- Able to quickly troubleshoot complex problems, understand dependencies, and deduce the root cause of issues.
Environment: Hortonworks Data Platform EDL, Spark, Splunk, Dynatrace, Diyotta, Cassandra, RHEL, Solaris, CentOS, MS Windows Server 2014, JMeter, Oracle 12c, Elasticsearch, Kubernetes, Docker, Azure cloud, VRA Labs
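The OpsCenter/nodetool monitoring described in this role can be sketched as a routine health-check pass. This is a generic sequence, not the employer's actual runbook, and the keyspace name is a placeholder:

```shell
# Quick Cassandra cluster health pass, run from any node.
nodetool status               # per-node up/down state, load, token ownership
nodetool info                 # heap usage, uptime, cache hit rates for this node
nodetool tpstats              # pending/blocked thread-pool tasks (back-pressure)
nodetool cfstats my_keyspace  # per-table latencies and SSTable counts (name assumed)
nodetool netstats             # active streaming and repair traffic

# Before taking a node down for maintenance: stop accepting requests
# and flush memtables to disk.
nodetool drain
```

Pending tasks in `tpstats` and rising SSTable counts in `cfstats` are the usual early signals that tuning is needed before latencies degrade.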
Confidential
DevOps QA Software Tester Engineer II
Responsibilities:
- Collaborating to deliver the US Bank Teller Processing Application (Omni Channel platform), which allows financial business transactions to be completed in a dynamic and intelligent fashion using artificial intelligence features.
- Participating in requirements reviews story design to ensure scope and testability.
- Author clear and concise test cases to cover the entire scope of user stories.
- Execute automated and manual test cases as required using the Cucumber framework and RESTful APIs.
- Collaborating with Product Owners, Software Developers, Scrum Masters and Project Managers to deliver solutions based on business requirements.
- Participating in meetings to gather and analyze requirements for new application features and data model design, and making recommendations from a Cassandra database standpoint.
- Cloud-native deployments of Cassandra on Kubernetes using StatefulSets.
- Setting up Test & Dev environments for the Teller Transaction Processing Omni Channel application deployed in Azure cloud and VRA environments.
- Setting up and configuring DataStax Cassandra multi-datacenter cluster in Azure and VRA environments.
- Participating in architectural and design meetings. Evaluate latest trends and conduct POC’s on latest Cassandra database technologies.
- Managing, designing, and supporting storage management best practices, archives, restores, and disaster recovery.
- Tuned Cassandra multi-node cluster configuration parameters, improving transaction-per-second throughput by 85%.
- Testing and verifying Omni Channel Teller Processing Application Kubernetes cluster deployment scripts.
- Monitoring and optimizing the Cassandra database using OpsCenter, nodetool, JConsole, and CQL queries.
- Verifying reported defects and follow through on fixes.
- Working with open source products such as Docker, Kubernetes, Grafana, Kibana, and dashboards.
- Capable of quickly troubleshooting complex problems, understanding dependencies and deducing the root cause of issues.
Environment: US Bank Omni-channel Teller Processing Application, Cassandra DSE and Community Editions, OpsCenter, Nodetool, MS Windows Server 2014, JMeter, Oracle 12c, Elasticsearch, Kubernetes, Docker, Azure cloud, VRA Labs, Eclipse IDE, Cucumber Framework, API Tester, FIO, Dynatrace performance tools, JIRA, Confluence, Bitbucket, CentOS 7, Oracle Linux 7
Confidential, Alpharetta, GA
Teradata DBA/Developer
Responsibilities:
- Participated in meetings to gather and analyze requirements for new applications, data model design, and changes; made recommendations from a Teradata database standpoint.
- Created user roles and access required for specific applications, and set up profiles.
- Supported Production, Test & Dev. environments for a large enterprise data warehouse project.
- Helped the teams by providing the skew factors of their tables to avoid the Hot AMP situations.
- Performed statistics collection and index analysis to quickly provide recommendations to the developer team.
- Used Teradata tools like Teradata Administrator, Teradata Manager, DBQL, TDWM, and SQL Assist.
- Worked with database activities: create role, create database, create user, create profile, increase user spool space, increase database perm space, drop user, and temporarily remove users.
- Managed, designed, and supported best practices for storage management, archive, restore, and disaster recovery.
- Worked with ARCMAIN, TARAGUI, and NetBackup to schedule recurring backup policies and maintain the general backup and restore process based on requirements.
- Applied space optimization concepts, adding compression and ensuring optimum column definitions.
- Monitored and optimized using Viewpoint, TASM, TDWM, and Visual Explain to analyze query plans.
- Implemented UPI, NUPI, UPPI, NUPPI, USI, NUSI, skew analysis, join index data access paths.
- Supported production databases in 24x7 environment
Environment: Teradata V14.10/V15.10, Teradata SQL Assistant, BTEQ, MultiLoad, FastLoad, FastExport, TPump, TPT, Teradata Manager, PMON, Teradata Administrator, VIEWPOINT, TDWM, DBQL, ARCMAIN, TARAGUI, NetBackup, Windows Server, MS SSRS, Oracle 10g/11g, IBM Mainframes.
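The user, profile, and spool-space administration listed above is typically scripted through BTEQ. A hedged sketch follows, with every system name, object name, and quota invented for illustration; real logons and sizes would differ:

```shell
# Hypothetical Teradata system, users, and quotas -- illustration only.
bteq <<'EOF'
.LOGON tdprod/dbadmin,secret

/* A profile centralizes spool/temp quotas for a group of users. */
CREATE PROFILE app_profile AS SPOOL = 50E9, TEMPORARY = 10E9;

CREATE USER app_user AS PERM = 0, PASSWORD = ChangeMe1,
    PROFILE = app_profile;

GRANT SELECT ON app_db TO app_user;

/* Later: raise the user's spool ceiling without recreating the user. */
MODIFY USER app_user AS SPOOL = 100E9;

.LOGOFF
.EXIT
EOF
```

Attaching quotas through a profile rather than per user is what makes the "increase spool space" requests above a one-line change for a whole application team.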
Confidential
Senior QA Software Storage Tester Engineer.
Responsibilities:
- Designed and authored test plans and test case executions for the Denali storage server.
- Performed baseline and benchmark performance testing.
- Collaborated with the Team performing Regression and Exploratory UI web applications testing.
- Created automated scripts in Python using ROBOT Framework.
- Created and automated performance test scripts using the Flexible I/O (FIO) tool.
- Prepared and executed UAT, Functional and Regression Testing manual and automated using Atlassian JIRA, TestRail, Git, Bamboo, Selenium2, and Robot frameworks.
- Configured multi-node cluster environments for QA testing processes using Vagrant, Ansible, and Docker.
- Prepared weekly/daily metrics and defect status reports using test management tools such as Atlassian JIRA, TestRail for JIRA, Git, and the Bamboo CI testing environment.
- Performed Cassandra database testing for the Denali UI web application.
Environment: Ubuntu Linux, Denali Data store cluster Cassandra, Selenium web driver, ROBOT Framework, FIO, Sysbench, TPT, Linux CentOS, Bamboo, GitLab, Atlassian JIRA, TestRail, Ansible, Docker, Vagrant, and Windows Servers.
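The FIO performance scripts mentioned in this role follow a common pattern: pin down one workload shape, then rerun it unchanged before and after each change to compare against the baseline. A generic example, with the target file and sizes purely hypothetical:

```shell
# Random-read baseline against a hypothetical mount point.
fio --name=randread-baseline \
    --filename=/mnt/denali/fio.test \
    --rw=randread --bs=4k --size=1g \
    --ioengine=libaio --direct=1 \
    --iodepth=32 --numjobs=4 \
    --runtime=60 --time_based \
    --group_reporting
```

`--direct=1` bypasses the page cache so the numbers reflect the storage device rather than RAM, which is what makes baseline-vs-benchmark runs comparable.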
Confidential
QA Software Test Lead Engineer
Responsibilities:
- Collaborated to deliver the Bank International Sales Platform Release 1.0, used by 12,000 officers to serve over 4 million customers in more than 1,400 branches across 29 countries.
- Designed, authored test planning and test cases executions for CRM International Sales and mobile web applications.
- Prepared and executed UAT tests using HP UFT, ALM, and ALDON for automation and defect logging.
- Performed data validations using MS T-SQL, SSRS, SSIS, SSDT, SQL Server 2014 databases, and the DIM/ETL Report Engine.
- Worked with onsite/offshore UAT Testers to perform Report Engine testing and data validations.
- Created test plans, test cases, and test scripts to be executed manually and automated with Selenium 2 and Robot Framework.
- Prepared weekly/daily metrics and defect status reports using test management tools such as HP ALM/ALDON.
- Worked with Jenkins and Git for continuous integration and source version control.
- Carefully analyzed BRD documents and added 650 unique test cases into the HP ALM application.
Environment: International Sales Platform, an integrated web desktop solution running on Microsoft Windows 2012 Server, SQL Server 2012, SSRS and SSIS under a Microsoft HA failover architecture in a 24/7 production configuration; CentOS Linux, Ubuntu Linux, MS T-SQL, SSIS, SSDT, SQL Server 2014 Database, HP UFT, ALM, and ALDON, Selenium 2 web driver, DIM/ETL Report Engine.
Confidential
Senior QA Analyst / DevOps Software Tester Engineer II
Responsibilities:
- Configured Windows 2008/2012/2014, CentOS, RHEL Linux, Dell, and IBM AIX servers on physical and virtual cloud environments.
- Troubleshot Windows Servers 2003, 2008 R2, 2012 R2, and 2014 R2, and Windows clients Vista, 7, 8, and 10 with Microsoft ADK and the .NET Framework.
- Performed Windows Assessment Services and Windows ETW trace file analysis using MA and HeavyLoad tools.
- Provided customer support for issues reported on Oracle 10g/11g/12c and SQL Server 2008/2012/2014 databases.
- Configured Chef Server automation tools, creating scripts using Cookbooks, Recipes, and Resources.
- Configured DataStax Cassandra multi-datacenter clusters deployed in Azure and VMware Lab environments.
- Configured and administered Cassandra Storage clusters environment with over 320TB storage size.
- Configured HAProxy, MS NLB, and F5 load balancing for each cluster.
- Troubleshot network issues across TCP/IP, DNS, UDP, NFS, CIFS, DHCP, and SMB using nslookup, netstat, traceroute, and netsh.
- Prepared and executed UAT, Functional, and Regression testing, manual and automated, using TestTrack Pro, TestLink, Jenkins, and P4 for continuous integration builds and source repository.
- Prepared and Executed Exploratory, Regression, UAT, Sanity, Functional and Performance Load Testing.
- Set up virtual test environments deployed on VMware Lab Manager, vCloud, and MS Azure clouds.
- Enabled MS Performance Counters, Task Manager, the HeavyLoad tool, Nagios, JConsole, TOA, and OEM.
- Scripted in UNIX Korn/Bash shell and Oracle SQL. Good knowledge of Python and Java OOP.
- Prepared and presented weekly/daily metrics and defect status reports.
Environment: CentOS Linux, Ubuntu Linux, Apache Cassandra, Oracle 10g, 11g, Oracle Enterprise Manager, SQL*Loader, Oracle RAC, Quest Software, VMware VLab, MS Hyper-V, Jenkins, P4, TestTrack Pro, TestLink, Confidential Isilon, Nexsan, Fujitsu storage, and Windows 2008 R2, 2012 R2, 2014 R2 Servers.
Confidential
Technical Support Engineer II SME
Responsibilities:
- Applied proven technical expertise using standard operating tools to analyze, diagnose, and resolve moderate to complex customer system issues that negatively impact Confidential's product performance at customer sites.
- Identified, documented, and escalated customer issues to the Development Engineering team to produce hot fixes.
- Communicated procedural technical issues to internal and external customers in a fast paced environment.
- Maintained a "closed-loop" communication style assuring all appropriate individuals are notified of ongoing issues.
- Shared all acquired knowledge concerning problem resolution with the field and customers.
- Assisted colleagues with technical requirements and trained them via brown-bag presentations and workshop sessions to improve their technical knowledge and skills regarding Confidential's new products and functionality.
- Applied Avamar application Hot Fixes and Patches on customer hosts production environments.
- Worked with Confidential ADS platforms such as GEN-3 and GEN-4 single-node and RAIN multi-node clusters, Confidential NAS Celerra storage, NDMP Accelerator nodes, Confidential VNXe, VNX Unified Storage, Control Stations, Data Movers, and the Unisphere application.
- Worked with VNXe and VNX local and remote replication suites, PowerPath, RecoverPoint/SE CDP and CRR appliances, SnapView, SnapSure, snapshot, and LUN architectures.
Environment: Oracle Ticketing System, VNX storage, Celerra storage, Data Domain, Avamar, and NetWorker; Oracle Databases 10g, 11g; Windows Servers, IBM AIX 5L, Sun Solaris, and HP-UX.