
Splunk Admin Resume


Dallas, TX

PROFESSIONAL SUMMARY:

  • 8+ years of extensive IT experience in Splunk development, administration, architecture, and upgrades for distributed Splunk environments on Linux (RHEL/CentOS), including monitoring, data analytics, performance tuning, troubleshooting, and database maintenance.
  • 3+ years of experience as a Splunk Developer, performing requirement analysis, design, and implementation of various client-server applications using Splunk 5.x/6.x; experience in design and development of Big Data analytics using Hadoop-ecosystem technologies, including Spark and Spark Streaming.
  • Subject matter expert (SME) in a team of Splunk Admins.
  • Experience in engineering and deploying analytics and SIEM SOC solutions in a large enterprise environment.
  • Expert scripting and development skills (Bash, Python, and Java) with strong knowledge of regular expressions; strong task management skills.
  • General networking and security knowledge (firewalls, routing, DNS, NAT, packet trace and analysis, etc.)
  • Expertise in installation, configuration, migration, troubleshooting, and maintenance of Splunk; passionate about machine data and operational intelligence.
  • Created accurate reports, dashboards, visualizations, and pivot tables for business users; installed and used the Splunk App for Unix and Linux (Splunk *nix).
  • Extensive experience and actively involved in Requirements gathering, Analysis, Reviews.
  • Experience with Splunk Enterprise deployments; enabled continuous integration as part of configuration management (props.conf, transforms.conf, outputs.conf).
  • Experience with Splunk Searching and Reporting modules, Knowledge Objects, Administration, Add-On's, Dashboards, Clustering and Forwarder Management.
  • Created and managed Splunk DB Connect identities, database connections, database inputs, outputs, lookups, and access controls.
  • Expertise in the Splunk query language (SPL); monitored database connection health using Splunk DB Connect health dashboards.
  • Collected detailed usage data from Amazon Web Services servers.
  • Worked on log parsing, complex Splunk searches, including external table lookups.
  • Experienced in using rex, sed, erex, and IFX to extract fields from log files.
  • Experience on Splunk data flow, components, features and product capability.
  • Experience on Splunk search construction with ability to create well-structured search queries that minimize performance impact.
  • Installed Splunk DB Connect 2.0 in single-server and distributed environments; familiar with parsing, indexing, and searching concepts, including hot, warm, cold, and frozen buckets.
  • Experience implementing and delivering monitoring solutions in development, QA, and Production environments.
  • Understanding and experience with configuration management tools and concepts such as Puppet, Chef, AWS.
  • Experience updating syslog-ng configuration without restarting the syslog-ng process.
  • Experience configuring rsyslog TCP/UDP ports and restricting connections by source IP.
  • Defined KPIs in ITSI along with threshold values for each KPI; created glass tables, deep dives, multi-KPI alerts, and notable events.
  • Created clustered and non-clustered indexes for increasing the performance, also monitored the indexes by troubleshooting any corrupt indexes by removing fragmentation from indexes.
  • Worked on the version control tools like GIT, TFS, SVN.
  • Working knowledge of data warehouse techniques and practices, including ETL processes, dimensional data modeling (Star Schema, Snowflake Schema, Fact & Dimension tables), OLTP, and OLAP.
  • Good understanding of Views, Synonyms, Indexes, Joins, and Sub-Queries.
  • Excellent communication, presentation, project management skills, a very good team player and self-starter with ability to work independently and as part of a team.
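The props.conf/transforms.conf management mentioned above can be illustrated with a minimal sketch. The sourcetype name, regex, and stanza names are hypothetical, not taken from any actual deployment:

```ini
# props.conf -- hypothetical sourcetype for a custom application log
[myapp:access]
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 25
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
TRANSFORMS-null = drop_debug

# transforms.conf -- route DEBUG-level events to the null queue
[drop_debug]
REGEX = \sDEBUG\s
DEST_KEY = queue
FORMAT = nullQueue
```

Both files live under an app's local/ directory (e.g. $SPLUNK_HOME/etc/apps/myapp/local/) and are typically pushed to indexers via the deployment server or cluster master.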

TECHNICAL SKILLS:

Splunk Modules: Splunk Enterprise, Splunk DB Connect, Splunk Cloud, Hunk, Splunk on Splunk, Splunk IT Service Intelligence, Splunk App for VMware, Splunk Web Framework, AWS.

BI Tools: SAP BO, SSRS, Crystal Reports, MS Excel, Actuate Reporting.

ETL Tools: Informatica, DataStage, Microsoft SQL Server Integration Services (SSIS)

Big Data: Splunk, Cassandra, Elasticsearch, Logstash, Hadoop, HDFS, Hive, Nagios, SIEM.

Databases: Teradata, Oracle, SQL Server, DB2, Sybase, Access.

Methodologies: Data Modeling - Logical/Physical/Dimensional, Star/Snowflake Schema, ETL, OLAP, Complete Software Development Lifecycle, Waterfall, Agile.

Scripting Languages: Python, Perl, Shell, Regular Expression, Bash, JavaScript, SQL, PL/SQL, C, C++, HTML, and XML.

PROFESSIONAL EXPERIENCE:

Confidential, DALLAS, TX

Splunk Admin

Responsibilities:

  • Installation of Splunk Enterprise, Splunk Forwarder, Splunk Indexer, and apps on multiple servers (Windows and Linux) with automation.
  • As part of SIEM, monitored notable events through Splunk Enterprise Security (Using V3.0).
  • Worked on SIEM security solutions that enable organizations to detect, respond to, and prevent threats by providing valuable context and visual insights for faster, smarter security decisions.
  • Experience with Splunk Enterprise Security (ES4) and Splunk ITSI. Knowledge of statistical modeling for anomaly, ML and outlier detection.
  • Splunk enterprise architecture, integration and deployment experience
  • Big data experience, including Kafka Connect, Storm, Spark, HDFS.
  • Knowledge of indicators of compromise (IOC) of systems and applications. Familiarity with key security events on common platforms.
  • Industry certifications such as CISSP, SANS, CEH, etc.; SDLC experience using JIRA and GIT.
  • Onboard new log sources with log analysis and parsing to enable SIEM correlation.
  • Installed and maintained Splunk add-ons, including DB Connect and Active Directory (LDAP), for working with directory services and SQL databases.
  • Configured the SSO integration add-on for user authentication and single sign-on in Splunk Web.
  • Configured and installed Splunk Enterprise, the agent, and Apache Server for user and role authentication and SSO.
  • Manage Splunk configuration files like inputs, props, transforms, and lookups.
  • Upgraded Splunk Enterprise to 6.2.3 and applied security patches. Deployed, configured, and maintained Splunk forwarders on different platforms. Ensured the application website was up and available to users.
  • Strong in analyzing data using HiveQL, Pig Latin, HBase and Map Reduce programs in java.
  • Expertise in Extending Hive and Pig core functionality by writing custom UDF's.
  • Expert in working with Hive data warehouse tool-creating tables, data distribution by implementing Partitioning and Bucketing, writing and optimizing the HiveQL queries.
  • Strong knowledge of Rack awareness topology in the Hadoop cluster.
  • Continuous monitoring of the alerts received through mails to check if all the application servers and web servers are up.
  • Created Splunk Search Processing Language (SPL) queries, reports, alerts, and dashboards; analyzed and fixed various defects.
  • Analyzed problem records and provided solutions; performed field extraction using IFX, the rex command, and regular expressions in configuration files.
  • Created reports, pivots, alerts, advanced Splunk searches, and visualizations in Splunk Enterprise; provided power and admin access for users and restricted their permissions on files.
  • Developed Splunk infrastructure and related solutions as per automation tool sets. Installed, tested and deployed monitoring solutions with Splunk services. Provided technical services to projects, user requests and data queries.
  • Performed Splunk administration tasks such as installing, configuring, monitoring and tuning.
  • Actively monitored jobs through alerting tools, responded with appropriate actions based on the logs, analyzed logs, and escalated critical issues to higher-level teams.
  • Worked on log parsing, complex Splunk searches, including external table lookups. Configured and administered Tomcat JDBC, JMS and JNDI services.
  • Designed and maintained production-quality Splunk dashboards; configured Splunk for various web and batch applications; created saved searches, summary searches, and summary indexes.
  • Deployed applications on multiple WebLogic Servers and maintained Load balancing, High availability and Fail over functionality.
  • Involved in monitoring the ticketing tool (service now) and taking the ownership of the tickets.
  • Created Crontab scripts for timely running jobs. Developed build scripts, UNIX shell scripts and auto deployment processes.
  • Helped teams onboard data, create various knowledge objects, and install and maintain Splunk apps and TAs; good knowledge of JavaScript for advanced UI work and Python for advanced backend integrations.
  • Provided 24/7 on-call Production Support.
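The rex-style field extraction described above can be prototyped outside Splunk. This is a minimal sketch in Python with a hypothetical access-log line; the named capture groups play the role of extracted field names, and the same regex could be reused in a `rex` command or an `EXTRACT-` stanza in props.conf:

```python
import re

# Hypothetical access-log line; field names mirror a Splunk rex extraction
log_line = '10.0.0.5 - admin [28/Feb/2016:10:15:01] "GET /app/search HTTP/1.1" 200 1432'

# Named capture groups become field names, as in: | rex "(?<clientip>\S+) ..."
pattern = re.compile(
    r'(?P<clientip>\S+) \S+ (?P<user>\S+) \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<uri>\S+) [^"]+" (?P<status>\d+) (?P<bytes>\d+)'
)

match = pattern.search(log_line)
fields = match.groupdict() if match else {}
print(fields["clientip"], fields["method"], fields["status"])
```

Testing the expression against sample events like this before deploying it to a search head helps minimize the performance impact of a bad regex.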

Environment: Splunk 6.2.3/6.x, BIG-IP load balancers, Apache HTTP Server 2.4 with configured plug-ins, Red Hat Linux 6.x, JSP, Servlets, XML, Oracle 11g, GIT, SVN, Nagios.

Confidential, Seattle, WA

Splunk Admin/ Developer.

Responsibilities:

  • Installed and maintained Splunk add-ons, including DB Connect 1 and Active Directory (LDAP), for working with directory services and SQL databases.
  • Installed HTTPS certificates for Splunk.
  • Played a major role in understanding the logs, server data and brought an insight of the data for the users.
  • Collected data from various resources. Installed forwarders, Indexers, Search Heads on the servers.
  • Field extractions for the log files, extracted complex Fields from different types of Log files using Regular Expressions.
  • Configured LDAP and developed custom app configurations (deployment-apps) within Splunk.
  • Managed Confluence users, permissions, and user directories; applied all configurations required for Splunk.
  • Created eval functions where necessary to create new fields at search run time.
  • Defined KPIs, alerts, glass tables, and KPI base searches in ITSI; backed up the ITSI configuration; worked on applications such as DB Connect, Fire Brigade, ITSI, and add-ons.
  • Worked on DB Connect configuration for Oracle and MySQL; configured the Distributed Management Console (DMC).
  • Created Dashboards for workflow purposes which help to find the root cause of the issues.
  • Created alerts, scheduled searches, and dashboards using post-process searches in Splunk.
  • Developed, evaluated, and documented the installation of Splunk for management purposes.
  • Very good understanding of the software development life cycle (SDLC) process; followed Agile Scrum and story maps for development tracking.
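The post-process search pattern mentioned above can be sketched in Splunk Simple XML. The dashboard label, index, and sourcetype below are hypothetical; the point is that one base search runs once and each panel refines its results:

```xml
<dashboard>
  <label>Error Overview (hypothetical)</label>
  <!-- Base search runs once; panels post-process its results -->
  <search id="base">
    <query>index=web sourcetype=myapp:access | stats count by status</query>
    <earliest>-24h</earliest>
    <latest>now</latest>
  </search>
  <row>
    <panel>
      <chart>
        <search base="base">
          <query>search status>=500</query>
        </search>
      </chart>
    </panel>
  </row>
</dashboard>
```

Sharing one base search across panels reduces search-head and indexer load compared with running a full search per panel.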

Environment: Splunk, LDAP, MySQL, Linux, Bash, Perl, HBase, Hive, Pig, Oracle 11g, MS SQL Server 2012, TFS, SVN.

Confidential, Boston, MA

Sr. Informatica Developer

Responsibilities:

  • Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
  • Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning. Extensive experience to monitor the customer volume and track the customer activity.
  • Worked on capturing, analyzing, and monitoring the Confidential online banking application's front-end and middleware applications.
  • Also, worked on code changes for various maintenance and customer reported bugs as part of production support.
  • Worked in Level 3 production support team in the bank's phone banking channel, which includes VRU (Voice Response Unit) and CTI (Computer Telephony Integration).
  • Achievements include a complete revamp of the Confidential testing and production environment, which included retiring hardware and saving around $1 million in hardware maintenance and costs.
  • Hands-on experience in application design, configuration management, performance tuning, code reviews, object-oriented design, and release and change management.
  • Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into enterprise data warehouse.
  • Ran and monitored daily scheduled jobs using Workload Manager to support EDW (Enterprise Data Warehouse) loads for both history and incremental data.
  • Investigated failed jobs and wrote SQL to debug data load issues in production.
  • Involved in Transferring the Processed files from mainframe to target system.
  • Supported the code after postproduction deployment.
  • Interacted with the Source Team and Business to get the Validation of the data.

Environment: Informatica Power Center 9.5/9.1, UNIX, SQL, MS Access, BO XI R2, Erwin, Shell Scripts, Rapid SQL, PVCS, Visio, AutoSys.

Confidential, Atlanta, GA

Informatica Developer

Responsibilities:

  • Designed and developed ETL process using Informatica tool.
  • Worked with various Active transformations in Informatica Power Center like Filter Transformation, Aggregator Transformation, Joiner Transformation, Rank Transformation, Router Transformation, Sorter Transformation, Source Qualifier, and Update Strategy Transformation
  • Responsible for extracting data from Oracle, Sybase, and Flat files
  • Responsible for the Data Cleansing of Source Data
  • Responsible for Performance Tuning in Informatica Power Center.
  • Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into enterprise data warehouse.
  • Extensively made use of sorted input option for the performance tuning of aggregator transformation.
  • Prepared the error handling document to maintain the error handling process.
  • Validated the Mappings, Sessions & Workflows, Generated & Loaded the Data into the target database
  • Created various tasks like Pre/Post Session, Command, Timer and Event wait.
  • Extensively used SQL Override function in Source Qualifier Transformation
  • Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation.
  • Worked with the Update Strategy transformation using functions like DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE.

Environment: Informatica Power Center 9.0 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer), Oracle 10g, SQL, PL/SQL, Teradata, Flat Files, AutoSys, StarTeam, UNIX, Linux, Windows XP.

Confidential

SQL DBA Developer

Responsibilities:

  • Primary responsibilities include profiling, performance tuning & redesign, high availability server network and disaster recovery.
  • Migrated SQL server 2000 to SQL Server 2005 in Microsoft Windows Server 2003 R2 Enterprise Edition SP2.
  • Installed and administered two node cluster Active/passive on Microsoft Windows Server 2003 R2 Enterprise Edition SP2 on SAN environment.
  • Migrated DTS packages from SQL Server 2000 to SSIS SQL server 2005.
  • Extensively worked on SQL Server Integration Services (SSIS).
  • Expert in using DMVs, the performance dashboard, mirroring, and database snapshots.
  • Solving the Bugs Reflected by the Testing team.
  • Rebuilding the indexes at regular intervals for better performance.
  • Recovered databases from backups during disasters.
  • Involved in trouble shooting and fine-tuning of databases for its performance and concurrency.
  • Involved in Source Data Analysis, analysis and designing mappings for data extraction also responsible for Design and Development of SSIS Packages to load the Data from various Databases and Files.
  • Expert in implementing the snapshot isolation and DDL triggers.
  • Responsible for monitoring and making recommendations for performance improvement in hosted databases. This involved index creation, index removal, index modification, file group modifications, and adding scheduled jobs to re-index and update statistics in databases.
  • Implemented new T-SQL features added in SQL Server 2005, including data partitioning, error handling through TRY...CATCH, and common table expressions (CTEs).
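The SQL Server 2005 features listed above can be sketched together in one T-SQL fragment. The table and column names are hypothetical, used only to show the shape of the constructs:

```sql
-- Sketch of TRY...CATCH plus a CTE (both new in SQL Server 2005);
-- dbo.Orders and its columns are hypothetical.
BEGIN TRY
    -- Common table expression: latest order per customer
    WITH LatestOrders AS (
        SELECT CustomerID, OrderID,
               ROW_NUMBER() OVER (PARTITION BY CustomerID
                                  ORDER BY OrderDate DESC) AS rn
        FROM dbo.Orders
    )
    SELECT CustomerID, OrderID
    FROM LatestOrders
    WHERE rn = 1;
END TRY
BEGIN CATCH
    -- TRY...CATCH replaces @@ERROR-style checks after every statement
    SELECT ERROR_NUMBER() AS ErrorNumber,
           ERROR_MESSAGE() AS ErrorMessage;
END CATCH;
```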

Environment: SQL Server 2000/2005 Enterprise, Windows Enterprise Server 2003, IIS 6.0, .NET, Microsoft Visio 2003, SFTP, Microsoft Integration Services.
