Splunk Engineer Resume
Duluth, GA
SUMMARY:
- Results-oriented Confidential Engineer with 6+ years of experience across various aspects of information and network security.
- Strong communicator with analytical and technical expertise, relationship management, and coordination skills.
- Experienced security consultant with seven years of IT experience focused on designing and developing security solutions.
- Experienced in architecting and deploying clustered/distributed Confidential Enterprise 6.x implementations for large, complex customers.
- Implemented and finalized Confidential infrastructure both in a lab reference environment and in production.
- Experienced in implementing Glass Tables via ITSI.
- Administered Confidential and Confidential apps, including developing new/custom apps to perform specialized functionality.
- Integrated Confidential with a wide variety of legacy data sources and industry-leading commercial security tools that use various protocols.
- In-depth knowledge of Confidential architecture and its various components (indexer, forwarder, search head, deployment server), Heavy and Universal Forwarders, and the license model.
- Worked on SIEM security solutions that enable organizations to detect, prevent, and respond to threats by providing valuable context and visual insights for faster, smarter security decisions.
- Headed proof-of-concepts (POCs) on Confidential implementation; mentored and guided other team members on understanding Confidential use cases.
- Experience deploying Confidential across UNIX and Windows environments; also familiar with deployment tools such as Chef.
- Perform data mining and analysis, utilizing various queries and reporting methods.
- Experience creating and managing Confidential DB Connect identities, database connections, database inputs, outputs, lookups, and access controls.
- Proficient in using SQL Server Integration Services (SSIS) to build data integration, workflow, and Extract, Transform, Load (ETL) solutions for data warehousing applications.
- Experience with Confidential technical implementation, planning, customization, and integration with big data and statistical/analytical modelling.
- Worked on various chart types and alert settings; knowledgeable in app creation and user/role access permissions; created and managed apps, users, roles, and permissions on knowledge objects; involved in setting up alerts for different types of errors.
- Worked extensively with complex mappings using different transformations like Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected / Connected Lookups and Aggregator.
- Worked with timechart attributes such as span and bins, tags, and event types; created dashboards and reports using XML; created dashboards from searches and scheduled searches, including inline vs. scheduled searches in a dashboard.
- Expert with various search commands such as stats, chart, timechart, transaction, strptime, strftime, eval, xyseries, and table; experienced with the extract keyword, sed, and similar commands (see the sketch after this list).
- Experience developing end-to-end planning and implementation of various network devices and business applications with SIEM tools (QRadar/Confidential).
- Expert-level understanding of QRadar implementation, its integration with other network devices and applications, and the related troubleshooting.
- Expertise in creating scripts for configuration backup, report backup, QRadar device reports, and metric generation.
- Experience creating custom views, reports, and automated alerting for both operational and security use cases in QRadar.
- Experience in SIEM security incident handling using RSA enVision and IBM QRadar, identifying the critical IT infrastructure that requires 24/7 monitoring.
- Strong experience in maintaining network/application security, applications programming, reverse engineering, malware analysis, and cryptographic algorithms; identified targeted attacks and other suspicious activity using a variety of network-based tools.
- Excellent organizational, presentation, communication, and project management skills; able to work both in a team and independently.
- Configured Confidential for all mission-critical applications and used it effectively for application troubleshooting and monitoring after go-lives.
- Worked directly with the Confidential Inc. sales team to determine log volume and licensing cost for the client's infrastructure.
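The following is a minimal, illustrative sketch of the kind of SPL search commands listed above, run through the Splunk CLI; the index "web", sourcetype "access_combined", and field names are hypothetical placeholders, not taken from any client environment.

    #!/bin/sh
    # Illustrative only: run common SPL commands (timechart, eval, stats, table)
    # through the Splunk CLI. Index, sourcetype, and fields are hypothetical.
    SPLUNK_HOME=/opt/splunk

    # Hourly event counts per host over the last 24 hours.
    "$SPLUNK_HOME/bin/splunk" search \
      'index=web sourcetype=access_combined earliest=-24h | timechart span=1h count by host'

    # Derive an error flag with eval, aggregate with stats, and format with table.
    "$SPLUNK_HOME/bin/splunk" search \
      'index=web sourcetype=access_combined earliest=-24h
       | eval is_error=if(status>=500, 1, 0)
       | stats count, sum(is_error) as errors by status
       | table status, count, errors'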
TECHNICAL SKILLS:
SIEM Tools: IBM QRadar, Confidential, IBM Guardium, Tripwire.
Puppet Master & Puppet: Monitoring, reporting & troubleshooting with Puppet Master; building hosts & writing manifests; Puppet scalability.
Operating Systems: Windows 2000, XP, Windows NT, Unix/Linux (Red Hat), VMware
Data Analysis: Requirement Analysis, Business Analysis, detail design, data flow diagrams, data definition table, Business Rules, data modelling, Data Warehousing, system integration
RDBMS: Oracle 11g/10g/9i/8i, MS SQL Server 2000/2005/2008, Sybase, DB2, MS Access.
Web Technologies: HTML, DHTML, JavaScript, XML, XSL, XSLT
Web/App Servers: Apache Tomcat 6.0, WebLogic 8.1/9.2, WebSphere 6.0
Concepts: TCP/IP, LAN/WAN, routers, firewalls and firewall ACLs, IPsec, PPTP, L2TP, BackTrack 4 R2, Snort, OSSEC, Tripwire, encryption algorithms, digital signatures, deploying PKI.
Programming Language: C, C++, Java, Python, UNIX shell scripts
PROFESSIONAL EXPERIENCE:
Confidential, Duluth, GA
Splunk Engineer
Responsibilities:
- Architected and deployed Confidential SIEM as part of the overall security strategy to monitor/control threats in an open environment.
- Involved in standardizing Confidential forwarder deployment, configuration and maintenance across UNIX and Windows platforms.
- Monitor Confidential Infrastructure for capacity planning and optimization.
- Onboarded new data into Confidential; troubleshot Confidential and optimized performance.
- Used Confidential Enterprise Security to configure correlation searches, key indicators, and the risk scoring framework.
- Good understanding of security threats and vulnerabilities and how to detect and mitigate them; experience building security monitoring and incident management solutions using Confidential.
- Created various chart types and alert settings; knowledgeable in app creation and user/role access permissions.
- Knowledgeable in search commands such as stats, chart, timechart, transaction, strptime, strftime, eval, where, xyseries, and table, and the difference between eventstats and stats.
- Investigated and validated all actionable security events and escalated or took action as indicated in the security model to mitigate threats.
- Learned about RSA’s portfolio of products with emphasis on SIEM and threat detection.
- Implemented integration between third party software and RSA software.
- Experience working with Confidential authentication and permissions, and significant experience supporting large-scale Confidential deployments.
- Created dashboards, visualizations, statistical reports, scheduled searches, and alerts, and worked on creating other knowledge objects.
- Strong experience in working with Confidential architecture and various Confidential components (indexer, forwarder, search head, deployment server), Universal and Heavy forwarder.
- Used BladeLogic to patch and install applications on several different test labs as well as operational Windows Server systems.
- Extensively used AppDynamics to monitor CPU, memory usage, JVM heap memory health, session and thread counts, and application log errors.
- Provided regular support and guidance to Confidential project teams on complex solutions and issue resolution.
- Worked on installing Universal Forwarders and Heavy Forwarders to bring all kinds of data into Confidential.
- Designed and maintained production-quality Confidential dashboards.
- Involved in admin activities and worked on inputs.conf, indexes.conf, props.conf, and transforms.conf to set up time zone and timestamp extraction, complex event transformations, and event breaking (see the sketch after this list).
- Experience developing Confidential queries and dashboards targeted at understanding application performance and capacity analysis.
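Below is a minimal sketch of the props.conf work described above; the sourcetype "custom:applog", time format, and line-breaker pattern are hypothetical, and in practice the stanza would be pushed from a deployment server or app rather than edited directly on an indexer.

    #!/bin/sh
    # Illustrative only: configure time zone, timestamp extraction, and event
    # breaking for a hypothetical sourcetype in props.conf, then verify.
    SPLUNK_HOME=/opt/splunk

    printf '%s\n' \
      '[custom:applog]' \
      'TZ = US/Eastern' \
      'TIME_PREFIX = ^\[' \
      'TIME_FORMAT = %Y-%m-%d %H:%M:%S' \
      'MAX_TIMESTAMP_LOOKAHEAD = 25' \
      'SHOULD_LINEMERGE = false' \
      'LINE_BREAKER = ([\r\n]+)\[' \
      >> "$SPLUNK_HOME/etc/system/local/props.conf"

    # Check the effective configuration and restart to apply it.
    "$SPLUNK_HOME/bin/splunk" btool props list custom:applog --debug
    "$SPLUNK_HOME/bin/splunk" restart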
Environment: Confidential 6.4, Confidential ES, Confidential DBConnect2.0, Confidential ITSI, Confidential ITOA, D3.js, Tomcat 7.x, JBoss 7.x, BIGIP Load Balancers, SAML, Wily Introscope 6.0, Configured plug-ins for Apache HTTP server 2.4, RedHat Linux 6.x, JDBC, JDK1.7, J2EE, JSP, Servlets, XML, Oracle 11g, GI.
Confidential, Atlanta, GA
Admin/Developer
Responsibilities:
- Troubleshot and resolved Confidential performance, search pooling, and log monitoring issues; role mapping, dashboard creation, etc.
- Created Confidential Search Processing Language (SPL) queries, Reports, Alerts, and Dashboards.
- Established indexes and the retention policy of buckets; developed user roles to complement operational and security utilization. Set up common sourcetypes using pre-trained datasets and constructed sourcetypes for unique data.
- Experience designing and implementing second-level analytics using the Elastic Stack.
- Performed Elasticsearch performance and configuration tuning. Collaborated with developer team to architect, develop and optimize Kibana visualizations.
- Created regular expressions for field extractions and field transformations in Confidential.
- Expertise in WebLogic Application Server administration, including installing, configuring, migrating, load balancing, deploying applications, performance tuning, upgrading, and maintaining WebLogic Server.
- Involved in analyzing daily application volume trends, issues, errors, and end-to-end reconciliation reports; took immediate, appropriate action in case of any business or customer impact.
- Assisted internal users of Confidential in designing and maintaining production-quality dashboards, helped the offshore team understand business use cases, and provided technical services for projects, user requests, and data queries.
- Configured services, entities, and correlation searches with corresponding KPI metrics in the Confidential ITSI application.
- Developed KPIs associated with a service and built glass tables, deep dives, and notable events.
- Created and configured KPI metrics in Confidential IT Service Intelligence (ITSI).
- Used Amazon Web Services (AWS), focusing mainly on planning, monitoring, deploying, and maintaining cloud infrastructure on multiple EC2 nodes and VMs in Linux/Unix (Red Hat, CentOS) environments for the project.
- Supported HPSM, Remedy, and the software distribution tool BladeLogic.
- Patched all Windows servers using BladeLogic, and MSPatchtool for ad hoc servers.
- Involved in implementing Ansible configuration management and maintaining it across several environments on AWS cloud and VMware.
- Created alarms and monitored and collected log files on AWS resources using CloudWatch on EC2 instances, generating Simple Notification Service (SNS) notifications (see the AWS CLI sketch after this list).
- Experience working with Confidential authentication and permissions, and significant experience supporting large-scale Confidential deployments.
- Added users for Confidential access through the Remedy process (AD groups) and managed Confidential authentication & authorization configuration files.
- Created input stanzas and prepared server classes to push monitoring stanzas so that Confidential reads the data and makes it visible in the UI.
- Applied skills such as understanding security policies, change management processes, Domain Name Service (DNS), data & traffic analysis, identifying security events, incident response, and IP addressing.
- Knowledge of Confidential architecture and its various components (indexer, forwarder, search head, deployment server), Heavy and Universal Forwarders, and the license model.
- Handled Confidential configuration involving different web applications and batch jobs; created saved searches, summary searches, and summary indexes.
- Worked on log parsing and complex Confidential searches, including external table lookups.
- Used techniques to optimize searches for better performance, including search-time vs. index-time field extraction; understanding of configuration files, their precedence, and how they work.
- Worked on the configuration files inputs.conf, indexes.conf, props.conf, serverclass.conf, transforms.conf, and limits.conf.
- Upgraded and migrated Confidential components and set up the retention policy for the indexes.
- Configured LDAP and Single Sign-On for user authentication in the organization.
- Supported HTTP methods for the REST API subsets, including CRUD operations such as GET, POST, and DELETE, returning an HTTP status code to indicate the success of the operation or the cause of a failure to fulfill the request (see the curl sketch after this list).
- Used cURL and REST client browser plugins to exercise the API.
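The following is a minimal AWS CLI sketch of the CloudWatch alarm and SNS flow mentioned above; the topic name, subscriber address, instance ID, and threshold are hypothetical examples, not values from the actual environment.

    #!/bin/sh
    # Illustrative only: create an SNS topic, subscribe an email endpoint, and
    # attach a CloudWatch CPU alarm on a hypothetical EC2 instance to it.

    TOPIC_ARN=$(aws sns create-topic --name splunk-infra-alerts \
      --query TopicArn --output text)
    aws sns subscribe --topic-arn "$TOPIC_ARN" --protocol email \
      --notification-endpoint ops-team@example.com

    aws cloudwatch put-metric-alarm \
      --alarm-name ec2-high-cpu \
      --namespace AWS/EC2 \
      --metric-name CPUUtilization \
      --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
      --statistic Average \
      --period 300 \
      --evaluation-periods 2 \
      --threshold 80 \
      --comparison-operator GreaterThanThreshold \
      --alarm-actions "$TOPIC_ARN"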
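Below is a minimal curl sketch of the REST calls described above; the endpoint, credentials, and payload are hypothetical, and -w prints the HTTP status code returned for each request.

    #!/bin/sh
    # Illustrative only: exercise GET, POST, and DELETE against a hypothetical
    # REST endpoint and print the HTTP status code returned for each call.
    BASE_URL=https://api.example.com/v1/widgets

    curl -s -o /dev/null -w 'GET -> %{http_code}\n' -u user:pass "$BASE_URL/42"
    curl -s -o /dev/null -w 'POST -> %{http_code}\n' -u user:pass \
      -H 'Content-Type: application/json' -d '{"name":"demo"}' "$BASE_URL"
    curl -s -o /dev/null -w 'DELETE -> %{http_code}\n' -u user:pass -X DELETE "$BASE_URL/42"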
Environment: Confidential 6.2, BladeLogic, Unix/Linux, XML, Web Services, Confidential DB Connect 2.2, Oracle 11g, ServiceNow, MS SQL Server 2012, Python scripting.
Confidential, Tampa, FL
Developer/Admin
Responsibilities:
- Managed service request tickets through the phases of troubleshooting, maintenance, upgrades, fixes, and patches, and provided all-round technical support.
- Installed Confidential Enterprise, Confidential forwarders, Confidential indexers, and apps on multiple servers (Windows and Linux) with automation.
- Experienced in Preparing, arranging, and testing Confidential search strings and operational strings.
- Installed and maintained Confidential add-ons, including DB Connect and Active Directory LDAP, to work with directory services and SQL databases.
- Configured the SSO Integration add-on app for user authentication and Single Sign-On in Confidential Web.
- Configured and installed Confidential Enterprise, agents, and Apache server for user and role authentication and SSO.
- Managed Confidential configuration files such as inputs, props, transforms, and lookups.
- Upgraded Confidential Enterprise to 6.2.3 and applied security patches; deployed, configured, and maintained Confidential forwarders on different platforms; ensured that the application website was up and available to users.
- Continuously monitored alerts received through email to check that all application servers and web servers were up.
- Created Confidential Search Processing Language (SPL) queries, reports, alerts, and dashboards; analyzed and fixed various defects.
- Analyzed problem records and provided solutions; performed field extraction using IFX, the rex command, and regular expressions in configuration files (see the sketch after this list).
- Created reports, pivots, alerts, advanced Confidential searches, and visualizations in Confidential Enterprise; provided power and admin access for users and restricted their permissions on files.
- Developed Confidential infrastructure and related solutions as per automation tool sets. Installed, tested and deployed monitoring solutions with Confidential services. Provided technical services to projects, user requests and data queries.
- Implemented new dashboards making use of the latest Confidential functionality along with CSS and JavaScript.
- Implemented forwarder configuration, search heads and indexing. Supported data source configurations and change management processes.
- Analyzed and monitored incident management and incident resolution problems; resolved configuration-based issues in coordination with infrastructure support teams; maintained and managed assigned systems and Confidential-related issues with administrators.
- Actively monitored jobs through alert tools and responded with appropriate actions based on the logs; analyzed the logs and escalated critical issues to higher-level teams.
- Worked on log parsing and complex Confidential searches, including external table lookups; configured and administered Tomcat JDBC and JMS services.
- Designed and maintained production-quality Confidential dashboards; handled Confidential configuration involving different web applications and batch jobs; created saved searches, summary searches, and summary indexes.
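A minimal sketch of search-time field extraction with the rex command, run through the Splunk CLI; the index, sourcetype, and sample log layout (user=... status=...) are hypothetical.

    #!/bin/sh
    # Illustrative only: extract "user" and "status" fields at search time with
    # rex from hypothetical log lines such as: user=jdoe status=403 ...
    SPLUNK_HOME=/opt/splunk

    "$SPLUNK_HOME/bin/splunk" search \
      'index=app sourcetype=custom:applog earliest=-4h
       | rex "user=(?<user>\w+)\s+status=(?<status>\d+)"
       | stats count by user, status'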
Environment: Confidential 6.2.3, Confidential 6.x, Confidential ES, BIG-IP Load Balancers, plug-ins for Apache HTTP Server 2.4, RedHat Linux 6.x, JSP, Servlets, XML, Oracle 11g, GIT.
Confidential
IBM Qradar Engineer
Responsibilities:
- Participated in the product selection and installation of the QRadar Security Information and Event Management (SIEM) solution, consisting of multiple collectors and a high-performance MS SQL and Oracle database.
- Responsible for proposing rules to the client to implement in QRadar to trigger security events; once the rules were approved, tested and implemented them in QRadar.
- Involved in writing procedures for the Level 1 Security Analysts on how to treat each offense, what to do when an issue occurs, and how to configure each type of device to send its logs to QRadar.
- Involved in configuring QRadar to send automatic reports using its report module; each report was sent to the client's different teams (Network, Database, System).
- Served as the global escalation point for degradation issues, as well as for addressing security events in accordance with First Data's information security policies, incident management, escalation management, the Incident Response Team, and the global CIRT team.
- Migrated existing reports and alerts from RSA enVision to IBM QRadar.
- Aggregated, correlated, and analyzed log data from network devices, security devices, and other key assets using QRadar.
- Created dashboards, reports, scheduled searches, alerts, and SIEM search and alert metrics.
- Responsible for maintaining availability, reporting, and communication between the SIEM, its event sources, and the endpoints.
- Configured reference sets as whitelists and blacklists for rules and reports.
- Created and ran QRadar searches for rules and reports.
- Developed comprehensive security event reports to address current and potential security concerns and meet Audit Requirements.
- Managed the day-to-day log collection activities of source devices that send log data to the IBM QRadar SIEM.
- Customized dashboards/enterprise dashboards for various teams based on log source type requirements.
- Identified current product management issues and developed a best-practices process to efficiently manage the Security Information and Event Management tool.
- Created scripts for configuration backup, report backup, QRadar device reports, and metric generation (see the sketch after this list).
- Cleaned up log sources auto-discovered in QRadar by identifying duplicates, correcting misidentified log sources, and identifying log sources from their logs.
- Analyzed various use cases in the QRadar console, such as malware and AD-related issues.
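The following is a minimal sketch of the kind of backup scripting described above; the source directories, destination, and 30-day retention are hypothetical and would be adapted to the actual QRadar deployment (for example, run from cron).

    #!/bin/sh
    # Illustrative only: archive hypothetical configuration and report
    # directories into a dated tarball and prune archives older than 30 days.
    SRC_DIRS="/store/configservices /store/reporting"   # hypothetical paths
    DEST=/backups/qradar
    DATE=$(date +%Y%m%d)

    mkdir -p "$DEST"
    tar czf "$DEST/qradar-backup-$DATE.tar.gz" $SRC_DIRS
    find "$DEST" -name 'qradar-backup-*.tar.gz' -mtime +30 -delete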
Environment: QRadar, RedHat Linux, XML, JSON, JavaScript, Oracle DB, GIT.