
Sr. Cyber Security Manager Resume

Plano, Texas

SUMMARY

  • 13+ years of hands-on experience in the IT industry across banking, cyber security, retail and firmware projects as a QA and Development Lead.
  • Experience leading engagements ranging from 15 to 162 resources, with budgets from 1.7 million per annum to 1.7 million per month.
  • Confidential PLCP-certified Project Lead with experience in planning, execution, issue/risk analysis and management, staffing/resource management, budgeting and forecasting.
  • Experience delivering projects leveraging global (onshore-offshore model) and multi-vendor teams using Waterfall and Agile methodologies.
  • Experience working in matrix organizations; developing IT strategy; managing client/stakeholder relationships; conflict resolution; contract negotiation and management; process improvement and rollout; SDLC (including Agile methodology); project lifecycle management; change management; incident management (ITIL methodology); release management; business continuity planning; creating SOWs; and responding to and managing the proposal process.
  • Heavily involved in practice development and maturity improvement, creating marketing material and responding to and managing RFPs.
  • Set up the release management processes (incident management, quality control and change management) for the Control Room domain.
  • Proficient in all phases and methodologies of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC) using Agile and Waterfall; project planning by understanding the needs of the client and interpreting their requirements into effective solutions.
  • Experience in application design, development, production support and maintenance projects, including 3+ years of experience with the Hadoop stack: HDFS, MapReduce, Sqoop, Pig, Hive, HBase and Spark.
  • 5+ years of experience developing UI and fax-related components for HP printers in C/C++ and core Java.
  • 2+ years of experience developing a data masking tool using Java/J2EE, Spring, Hibernate and Hazelcast.
  • Extensive experience in creating and monitoring test plans, test scenarios, test cases, test reports, status reports and other documentation for Manual testing and execution.
  • Expertise in validating developed report functionality by writing complex SQL queries.
  • Excellent SQL skills; expertise in writing SQL queries, PL/SQL stored procedures, functions and packages for business needs, and in Unix scripting.
  • Knowledge and experience working on data-centric projects, including Big Data (Hadoop) testing and DW/ETL testing.
  • Experience in importing and exporting data using Sqoop between HDFS/Hive and relational database systems.
  • Experienced and well versed in writing and using UDFs in both Hive and Pig using Java.
  • Strong skills in SQL, data analysis, unit testing and debugging data quality issues.
  • Experience in designing and developing POCs in Spark using Scala to compare the performance of Spark with Hive and SQL/Oracle.
  • Experience in manipulating/analyzing large datasets and finding patterns and insights within structured and unstructured data.
  • Developed Scala scripts and UDFs using both the DataFrame/SQL and RDD/MapReduce APIs in Spark for data aggregation and queries, writing data back into OLTP systems through Sqoop.
  • Worked on all aspects of software development lifecycle (SDLC) - requirements gathering, creating technical and functional design documents, systems and application analysis, developing and unit / integration testing of modules. Worked closely with business and technical users during design and development.
  • Experience in writing the Restful Web services.
  • Experience in developing and deploying J2EE components on application servers such as BEA WebLogic Server 8.1/9.0, Apache Tomcat 5.5/6.0 and JBoss.
  • Strong experience with configuration management tools such as ClearCase, Git and CVS.
  • Strong experience with relational databases such as Oracle, MySQL and SQL Server, and with Informatica for ETL.
  • Good Experience in Finance and Retail.
  • Knowledge in defining/applying data masking, following the data provisioning strategy and building a gold-copy repository.
  • Defined and documented data testing strategies and test plans; responsible for supporting test environment and data needs for unit as well as functional testing during the course of delivery.
  • Strict adherence to environment- and data-related SLAs.
  • Well versed in Python 3.7.
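As a concrete illustration of the SQL-based report validation described above, here is a minimal sketch using Python's built-in sqlite3 module standing in for the reporting warehouse; the table and column names are hypothetical, invented only for this example:

```python
import sqlite3

# In-memory database standing in for source and reporting tables (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_txn (id INTEGER, amount REAL);
    CREATE TABLE report_txn (id INTEGER, amount REAL);
    INSERT INTO source_txn VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO report_txn VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

# A common ETL/report validation pattern: row counts and aggregate
# totals in the published report must match the source extract.
src_count, src_total = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM source_txn").fetchone()
rpt_count, rpt_total = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM report_txn").fetchone()

assert src_count == rpt_count, "row count mismatch"
assert abs(src_total - rpt_total) < 1e-9, "aggregate total mismatch"
print("report validation passed:", src_count, "rows")
```

In practice the same count/sum reconciliation queries would run against the production warehouse rather than SQLite.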

TECHNICAL SKILLS

Programming: C, C++, Java/J2EE, JDBC, Visual C++, EJB, JSP, Servlets, and JMS

Big Data: Data Analysis, Hadoop, Hive, and MapReduce

Operating Systems: Microsoft Windows (all versions), Linux, Solaris, and Unix

Databases: DB2, Informix, MS SQL Server, MySQL, and Oracle

Process Management: Agile, QA / Quality Assurance, Scrum, SDLC and Waterfall

Special Skills: Microsoft Office

Frameworks/Tools: Agile Testing, Cucumber, Database Testing, ETL Testing, Manual Testing, QTP, Quality Center (ALM), Test Management, TOAD, and SQL Developer

RDBMS: Oracle 8i/9i/10g, SQL Server 2000/2005, MySQL, and DB2

Web Technologies: HTML, XML, CSS, and AngularJS

Big Data/Hadoop Tools: Pig, Hive, MapReduce, Spark and Kafka

PROFESSIONAL EXPERIENCE

Confidential, Plano, Texas

Sr. Cyber Security Manager

Responsibilities:

  • Portfolio planning and forecasting: Tracking and managing project financials and streamlining the process flow of data capture and MIS for financials. Actively engaged in resource planning and forecasting.
  • Process defining: Continuously defining and improving existing processes for Portfolio and project level reporting. Streamlined project level reporting, financial reporting and business reporting at a portfolio level.
  • Executive presentations: Create & manage presentations for various executive meetings by liaising with several groups and functions.
  • Resource Management: Hiring and tracking open Project Manager and Business Analyst positions and resource movements for the Americas and EMEA regions.
  • Vendor Management: Liaison for strategic vendors in the Americas, EMEA & APAC regions, which involves hiring, onboarding and reviewing SOWs.
  • Tracking: Maintain audit and regulatory deliverables and deadlines. Manage other portfolio-level deliverables such as IT risks, non-conformances (NCs) per project, deployments per project, travel plans, vacation plans, training plans, etc.
  • PPM Tool: Extensive knowledge of the CA Clarity OnePPM tool's project and portfolio management capabilities.
  • Release management: Established the release management process for the Control Room domain.
  • Support for Security Operations activities, mainly in Cisco IronPort Web Proxy and McAfee ePO (OAS, active protection, exploit prevention, policy implementation, etc.).
  • Served as the interface for new technical requirements.
  • Perform requirements analysis and identify customer's business requirements. Provide recommendations in key strategic areas and new enhancements in security operational process.
  • Converted technical requirements into SOPs and specifications to be deployed on the respective security technologies.
  • Handled web proxy and endpoint security related issues that come in through either ITSM or service catalog requests.
  • Performed McAfee ePO health checks to keep agents at the same version level, along with in-depth diagnostics, fine tuning and enhancements.
  • Onboarded Imperva agents for databases such as MySQL, Oracle and Big Data platforms; also moved agents from one MX to other MXs, with SOP standardization, fine tuning and enhancements.
  • IDS device administration, deploying user-defined signatures.
  • Perform discovery, security tool integrations, complex issues troubleshooting and security issue remediation for the endpoints
  • Engaged with cross-functional IT teams to understand and assess infrastructure and process requirements for SecOps service requests.
  • Created and fine-tuned reports for analysis; provided regulatory compliance reports.
  • Coordinated weekly team meetings with internal and external stakeholders to discuss project activities and deliverables.
  • Solution upkeep, configuration backups, vendor management, monitoring of device performance, and updates/signatures.
  • Pushed DAT updates to devices.
  • Security Operations support for security devices such as antivirus, web proxy, IDS, endpoints and DLPs
  • Create change requests, Implement minor changes, Hardware break fix, Software break fix, Knowledge base updates
  • Operational SLA reporting, capacity usage reporting.
  • Designed, developed and deployed Big Data applications; worked with the Hadoop ecosystem: MapReduce, HDFS, Hive, Sqoop and Linux.
  • Compliance/Metrics Reporting
  • Collect and submit monthly quality control (QC) evidence and yearly quality assurance (QA) evidence for the Application, Boundary, Endpoint and Data Operations processes.
  • Collect and report weekly metrics, including coverage, efficiency, utilization, uptime, incident/ticketing and change control.
  • Track risk extension remediation.
  • Track and fulfill audit evidence requests.
  • Track vulnerability remediation
  • Create weekly (WSR) and monthly (MSR) status reports.
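The uptime and SLA metrics reported above reduce to a simple calculation over outage windows; a minimal Python sketch (the outage timestamps and the 31-day reporting window are hypothetical values, not taken from any real device):

```python
from datetime import datetime, timedelta

# Hypothetical outage windows for one security device over a reporting month.
outages = [
    (datetime(2020, 3, 4, 2, 0), datetime(2020, 3, 4, 3, 30)),      # 90 min
    (datetime(2020, 3, 18, 22, 0), datetime(2020, 3, 18, 22, 45)),  # 45 min
]

period = timedelta(days=31)  # the reporting window

# Total downtime is the sum of the outage durations; uptime is the
# remaining fraction of the reporting period, expressed as a percentage.
downtime = sum((end - start for start, end in outages), timedelta())
uptime_pct = 100.0 * (1 - downtime / period)

print(f"downtime: {downtime}, uptime: {uptime_pct:.3f}%")
```

A real SLA report would pull the outage intervals from the ticketing/ITSM system and exclude approved maintenance windows before applying the same arithmetic.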

Confidential, NYC, New York

Program/Engagement Manager

Responsibilities:

  • Estimated testing efforts for team and provided direction for the design, development and execution of test strategies in different areas of testing like Environmental, Tools and Functional testing.
  • Designed the QA Dashboard concept that is widely used by the whole engineering team to track new feature developments, agile backlog tasks, prioritization of bug fixing as well as bug verification tasks.
  • Developed strategy, approach & prioritized testing needs based on given resources & expertise to perform various tests.
  • Acted as a liaison within engineering to work closely with other members of the product team to troubleshoot, debug, and resolve a variety of diverse requirement, technical, data generation and test coverage issues.
  • Attended daily scrum, sprint planning and weekly PA and QA team status meetings to monitor the progress and help the team member achieve the deadline.
  • Monitored the various training requirements of the team members
  • Led various business and technical requirements review sessions with product analysts and the technical design leads
  • Automated the test processes to reduce the times to complete testing dramatically, usually down from weeks to several hours.
  • Established standardized processes among the PA and QA teams for effective traceability and release management.
  • Worked closely with the other teams to effectively use their products for test data generation and verification and validation process.
  • Define, manage and maintain the architecture governance process, system and workflow for big data solutions globally
  • Coordinate architecture governance review for all approved big data related projects at the commencement of project development, during project execution and after project implementation
  • Ensure that any architecture issues have adequate support and timely resolution
  • Effectively and clearly communicate architecture review status and articulate business impacts to key stakeholders
  • Involved in processing the daily files received from upstream into HDFS using Hive.
  • Involved in migrating Hive workloads to Spark Streaming using Scala.
  • Implemented Spark jobs in Scala, utilizing DataFrames and the Spark SQL API for faster processing of data.
  • Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.
  • Support code/design analysis, strategy development and project planning.
  • Performed advanced procedures like text analytics and processing, using the in-memory computing capabilities of Spark using Scala.
  • Worked extensively with Sqoop for importing metadata from Oracle.
  • Followed DevOps practices and used Jenkins for continuous integration and continuous delivery.
  • Developed Python 3.0 scripts.
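The DataFrame aggregation pattern used in the Spark SQL work above can be sketched without a cluster; this is a plain-Python stand-in for a `df.groupBy("region").sum("amount")` style query, with hypothetical field names (the production jobs ran in Scala on Spark):

```python
from collections import defaultdict

# Rows as they might arrive from a daily upstream file (hypothetical fields).
rows = [
    {"region": "EMEA", "amount": 120.0},
    {"region": "APAC", "amount": 80.0},
    {"region": "EMEA", "amount": 55.0},
]

# Group-by-and-sum, the shape of a Spark SQL aggregation or a Hive
# query that compiles down to a MapReduce shuffle-and-reduce.
totals = defaultdict(float)
for row in rows:
    totals[row["region"]] += row["amount"]

print(dict(totals))  # → {'EMEA': 175.0, 'APAC': 80.0}
```

Spark distributes exactly this shuffle-and-reduce across executors, which is why migrating the Hive jobs to in-memory DataFrames speeds them up without changing the aggregation logic.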

Confidential

Onsite Manager

Responsibilities:

  • Onboarding of resources: Hired Java/J2EE, mainframe and QA resources internally, and via hiring and RCE, for the Pershing account based on the staffing management plan; ascertained staff acquisition, routed resources to the required projects and managed all necessary logistics.
  • Accrual Reporting: Monthly generation of accrual reports and submission of the same to senior management. This entire process was automated using MS access forms.
  • Off-boarding of resources: Released resources from their respective projects in a way that benefits both the project and the team members, also ensuring that costs associated with those resources were not charged to the projects.
  • Compliance: Ensured that on/off-boarded resources adhered to both Client and Confidential policies during their tenure on the project.
  • Installed/Configured/Maintained Apache Hadoop clusters for application development and Hadoop tools like Hive, Pig, and Sqoop
  • Configured, designed, implemented and monitored Kafka clusters and connectors.
  • Meet with client to understand and document needs
  • Meet with business and technical teammates to evaluate and qualify client requirements
  • Define scope of projects
  • Confirm scope of project to ensure profitability
  • Verified that all needed personnel and tools were available.
  • Document technical and business requirements in technical roadmap plan
  • Develop task-level project plans in project management software
  • Analyze and qualify tasks with business and technical teammates
  • Identify repeatable, common tasks to construct initial template of project plans
  • Create and maintain internal and external training material
  • Led weekly check-ins to ensure task notes were updated.
  • Analyzed large data sets to determine the optimal way to aggregate and report on them. Handled importing of data from various data sources and performed transformations using Hive and MapReduce.
  • Helped with the sizing and performance tuning of the Cassandra cluster
  • Extracted the data from Teradata into HDFS using Sqoop
  • Analyzed the data by performing Hive queries and running Pig scripts to understand user behavior such as shopping patterns.
  • Configured Oozie workflow to run multiple Hive and Pig jobs which run independently with time and data availability
  • Optimized MapReduce code, pig scripts and performance tuning and analysis
  • Implemented advanced procedures like text analytics and processing, using the in-memory computing capabilities of Spark
  • Exported the aggregated data into Oracle using Sqoop for reporting on the Tableau dashboard
  • Involvement in design, development and testing phases of Software Development Life Cycle
  • Performed installation, updates, patches and version upgrades when required for Hadoop
  • Responsible for leading a team of ETL specialists producing non-production data, using ETL concepts for Data Warehouse/Data Mart, database (ETL & BI), web, client-server and high-frequency trading systems and applications in the investment banking and financial domain; experience in the Basel II/2.5, Basel III, IHC and CCAR banking & securities (investment banking) domain for Confidential.
  • Validated the regulatory reports against the functionality required by the authorities after understanding the requirements, and ensured that the reports were compliant.
  • Developed and performed regulatory report tests (STAR, IHC, PeopleSoft, US-RSM, Basel II, Basel III and GBMR). Ran the Axiom workflow, generated the reports and cross-validated them against the functional and regulatory requirements.
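The Teradata-to-HDFS extracts above are typically driven by a `sqoop import` invocation; a small sketch that assembles one in Python, where the JDBC URL, username, table and target directory are hypothetical placeholders:

```python
# Assemble a typical sqoop import command line as an argument list,
# ready for subprocess.run(); all values below are placeholders.
def sqoop_import_cmd(jdbc_url, user, table, target_dir, mappers=4):
    return [
        "sqoop", "import",
        "--connect", jdbc_url,        # JDBC connection string
        "--username", user,
        "--table", table,             # source RDBMS table
        "--target-dir", target_dir,   # destination HDFS directory
        "--num-mappers", str(mappers) # parallel map tasks for the import
    ]

cmd = sqoop_import_cmd(
    "jdbc:teradata://td-host/DATABASE=sales",  # hypothetical host/database
    "etl_user", "ORDERS", "/data/raw/orders")
print(" ".join(cmd))
```

Building the command as a list (rather than a shell string) keeps the arguments safely quoted when handed to a scheduler such as Oozie's shell action or `subprocess.run`.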

Confidential

ETL and Hadoop development Manager

Responsibilities:

  • Developed a Crawlers Java ETL framework to extract data from Cerner clients' databases and ingest it into HDFS and HBase for long-term storage.
  • Wrote data processing pipelines in Apache Crunch to standardize and normalize the data, storing the normalized data in HBase.
  • Created ETL pipelines using Apache Crunch to read data from HBase, calculate KPI metrics, store the data in HDFS and bulk-load data into HBase.
  • Experience building the ETL pipeline from HDFS to Vertica.
  • Worked on migrating all the batch ETL Crunch pipelines to in-memory Spark pipelines for quicker job run times.
  • Created Oozie workflows to manage the execution of the Crunch jobs and Vertica pipelines.
  • Cluster tuning to improve performance.
  • Created integration tests to check the validity of the data being crawled.
  • Implemented new drill-to-detail dimensions in the data pipeline based on business requirements.
  • Worked with multiple teams in resolving production issues.
  • Deployment automation via Chef.
  • Handled Prod Deployments and provided production support for fixing the defects.
  • Responsible for creating business layer and its underlying data foundation and connection layer using Information Design Tool.
  • Analyze the reporting requirements with solution designers and come up with the technical specifications for the Business Objects reports.
  • Created reports based on business requirements using SAP BO Web Intelligence and implemented sorting, filtering, ordering and labeling of reports.
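The KPI-metric step in the Crunch pipelines above can be sketched in miniature; a plain-Python stand-in with hypothetical record fields (the production pipeline computes the same per-key aggregation in Apache Crunch over HBase rows):

```python
# Hypothetical normalized events as they might be read back from HBase.
events = [
    {"facility": "A", "visits": 10, "readmits": 1},
    {"facility": "A", "visits": 20, "readmits": 3},
    {"facility": "B", "visits": 15, "readmits": 0},
]

# Roll up raw counts per facility, then derive the KPI (readmission
# rate) from the aggregates -- the same group/aggregate/derive shape
# a Crunch pipeline applies before bulk-loading results into HBase.
kpis = {}
for e in events:
    agg = kpis.setdefault(e["facility"], {"visits": 0, "readmits": 0})
    agg["visits"] += e["visits"]
    agg["readmits"] += e["readmits"]

rates = {f: a["readmits"] / a["visits"] for f, a in kpis.items()}
print(rates)
```

Deriving the rate only after summing the counts matters: averaging per-event rates would weight small facilities the same as large ones and skew the KPI.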
