Splunk Engineer Resume
SUMMARY
- 16 years of experience in information management: end-to-end data warehousing, business intelligence, data integration, data migration, and security.
- 1.5 years of extensive experience in Splunk, Linux/UNIX, and PL/SQL. As a Splunk Developer, performed requirement analysis, design, and implementation for various client-server applications.
- Expertise in installation, configuration, migration, troubleshooting, and maintenance of Splunk; passionate about machine data and operational intelligence.
- 1.5 years of experience in Splunk: developing dashboards, forms, SPL searches, reports, and views; administration, upgrading, alert scheduling, KPIs, visualization add-ons, and Splunk infrastructure.
- Splunk Technical Specialist responsible for the design, performance, implementation and capacity of the Splunk Platform.
- Worked as Splunk support for system admins, content managers, and developers to ensure delivery of Splunk best practices and standards for each job function
- Worked on platform architecture and capacity planning, as well as several platform upgrades and optimizations
- Analyzed log files and thread dumps and made recommendations to improve the efficiency of applications running on the host
- More than four years of work experience with scripting languages such as Bash, Python, and Perl
- Good command in writing Splunk searches; Splunk Infrastructure and Development expert well-versed with Splunk architecture and design.
- Experience with C++, Linux and BASH scripting
- Headed proofs-of-concept (POCs) on Splunk ES implementation; mentored and guided other team members on understanding Splunk use cases.
- Expertise in Big Data, Hadoop, Splunk, JVM, and Python technologies; generates reports on KPI analysis as needed with Splunk.
- Familiar with service-oriented architecture and web services integration (SOAP, WSDL, REST). Expertise in customizing Splunk for monitoring, application management, and security per customer requirements and industry best practices.
- Experience in responding to requests and incident tickets within defined Service Level Agreements
- 13 years' experience with ETL processes from various sources and archive systems into data warehouses/data marts using IBM DataStage 7.1, 7.5, 8.5, 8.7, 9.1 & 11.5
- Organizing and planning project from inception to completion
- Monitoring, addressing project risks and driving project to completion
- Business development: identified opportunities and proposed solutions to add value through effective use of BI applications; led aspects of the proposal development process; scoped effort and cost of solutions; developed SOWs
- As a member of the Business Intelligence Performance Management practice, conceptualized and designed solution approaches for data warehouse implementations, implemented best practices, and performed BI assessments of client projects
- UNIX shell scripting for file manipulation; strong knowledge of scheduling DataStage jobs using crontab.
- Performance tuning and optimization of the DataStage parallel jobs and server jobs
- Integration of various data sources like Oracle, SQL Server, and DB2 using DataStage
- Extensive experience with relational and dimensional data modeling for creating logical and physical design of database and ER Diagrams using data modeling tool like ERwin
- Leader and team player with proven conceptual, analytical and problem solving capabilities
- Worked on Hive for data extraction from the EDL
- Expert in using QualityStage stages for standardization, scrubbing, matching, and survivorship rules.
- Ability to plan, organize, direct, implement and evaluate processes to lead people and manage resources to achieve desired result
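The log-file analysis and scripting experience above can be illustrated with a minimal Python sketch; the log format and severity levels here are assumptions for illustration, not the actual formats analyzed:

```python
import re
from collections import Counter

# Hypothetical syslog-style line format; real formats vary by source system.
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\S+ \S+) (?P<level>INFO|WARN|ERROR) (?P<message>.*)"
)

def summarize_levels(lines):
    """Count log lines per severity level, skipping lines that don't match."""
    counts = Counter()
    for line in lines:
        match = LOG_PATTERN.match(line)
        if match:
            counts[match.group("level")] += 1
    return counts

sample = [
    "2024-01-05 10:00:01 INFO service started",
    "2024-01-05 10:00:02 ERROR connection refused",
    "2024-01-05 10:00:03 ERROR connection refused",
]
print(summarize_levels(sample))
```

In practice this kind of summary feeds the efficiency recommendations mentioned above, e.g. flagging hosts whose error rate spikes after a deploy.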
TECHNICAL SKILLS
Data Warehouse: Data Warehouse Development Life Cycle, Dimensional Data Modeling, OLAP, Normalization, Denormalization, Hierarchy, Star Schema, Snowflake, Dimension, Fact, Logical/Physical Data Modeling, Slowly Changing Dimension
Databases: Oracle 7.x/8/8i/9i/10g, Teradata, MySQL, DB2, SQL Server
ETL: DataStage 7.x, 8.5, 8.7, 9.1 & 11.5 (InfoSphere), and Informatica 8.1.1
Programming Languages: C, C++, DHTML, Java, PL/SQL, SQL, Perl, UNIX Shell Scripting and ASP
Data Modeling Tool: Erwin
SDLC Methodologies: Agile Scrum, Test Driven Development, Waterfall
SW Development Tools: Developer 2000, Dreamweaver, Oracle Forms and Oracle Reports
Software Development: Software Engineering, System Design and Analysis and Modular Development
Operating Systems: Windows 2000 Server, Windows NT 4.0, UNIX, IBM AIX 5.x/6.1, and Red Hat Linux 5.5
Others: Microsoft Visio 2003, Tivoli 9.x, Mainframe (ISPF), ZEKE scheduling tool, IBM Information Analyzer, MuleSoft, Splunk
PROFESSIONAL EXPERIENCE
Confidential
Splunk Engineer
Responsibilities:
- Owning all technical aspects of Splunk build and dashboard development; performing hands-on architecture, design, and development of systems
- Develop filters to assist in the identification of significant events
- Created Splunk knowledge objects (e.g. fields, lookups, macros, etc.)
- Administer and Manage Splunk Apps to perform customized functionalities for Big Data platforms
- Maintain, manage, and monitor Splunk infrastructure (identify bad searches and dashboards, and manage the overall health of Splunk)
- Perform content development to properly identify data feeding SIEMs and correlation of events
- Assist in the proper operation and performance of Splunk, loggers and connectors.
- Designed Splunk Enterprise 6.5 infrastructure to provide high availability by configuring clusters across two different data centers.
- Installed, Configured, Maintained, Tuned and Supported Splunk Enterprise server 6.x/5.x.
- Architected and implemented Splunk solutions in highly available, redundant, distributed computing environments.
- Performed Field Extractions and Transformations using the RegEx in Splunk.
- Installed, configured, and administered Splunk Enterprise on Linux and Windows servers.
- Installation and implementation of the Splunk App for Enterprise Security and documented best practices for the installation and performed knowledge transfer on the process.
- Worked on installing Universal Forwarders and Heavy Forwarders to bring various kinds of data into Splunk.
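The regex-based field extractions mentioned above can be sketched in Python; the event format and field names are assumptions, but the named-capture-group approach mirrors what Splunk's `rex` command and `transforms.conf` extractions do:

```python
import re

# Hypothetical firewall-style event; real events vary by data source.
event = "src=10.1.2.3 dst=10.4.5.6 action=blocked port=443"

# Named capture groups play the role of extracted fields, analogous to
# a REGEX stanza in transforms.conf or an inline rex in a search.
FIELD_RE = re.compile(
    r"src=(?P<src>\S+) dst=(?P<dst>\S+) action=(?P<action>\S+) port=(?P<port>\d+)"
)

def extract_fields(raw_event):
    """Return a dict of field name -> value, or an empty dict if no match."""
    m = FIELD_RE.search(raw_event)
    return m.groupdict() if m else {}

print(extract_fields(event))
```

Once fields are extracted this way, they can drive the alerts, dashboards, and SIEM correlations the bullets above describe.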
Confidential
Principal Consultant (Splunk) & ETL
Responsibilities:
- Designing and building new log and data mining services, including planning and supporting execution of the build-out
- Perform data mining and analysis, utilizing various queries and reporting methods
- Assist internal users of Splunk in designing and maintaining production-quality dashboards
- Worked on client engagements and data onboarding, writing alerts and dashboards using the Search Processing Language (SPL).
- Expert-level understanding of Splunk Enterprise event management and tags
- Strong knowledge of Windows, Linux, and UNIX operating systems.
- Install, configure, and troubleshoot Splunk. Experience with regular expressions and using regular expressions for data retrieval. Work with application owners to create or update monitoring for applications.
- Provided regular support guidance to Splunk project teams on complex solution and issue resolution.
- Writing Splunk Queries, Expertise in searching, monitoring, analyzing and visualizing Splunk logs.
- Experience in alert handling, standard availability and performance report generation. Experience in root cause analysis of post-production performance related issues through Splunk tool.
- Designing, optimizing and executing Splunk-based enterprise solutions.
- Installed and configured Splunk Universal Forwarders on both UNIX (Linux, Solaris, and AIX) and Windows Servers.
- Hands on experience in customizing Splunk dashboards, visualizations, configurations using customized Splunk queries.
- Monitored the Splunk infrastructure for capacity planning, scalability, and optimization.
- Experienced in using Splunk DB Connect for real-time data integration between Splunk Enterprise and other databases.
- Expertise in Actuate Reporting, development, deployment, management and performance tuning of Actuate reports.
- Responsible for Splunk Searching and Reporting modules, knowledge objects, administration, add-ons, dashboards, clustering, and forwarder management.
- Monitored license usage, indexing metrics, Index Performance, Forwarder performance.
- Played a lead role from requirements analysis through design and implementation, bringing Loans, Capital Markets, Security, and Bloomberg data into Moody’s RO process using DataStage ETL; this was NRCR's biggest project at BNS.
- Working on DataStage 11.5 for the data extraction from Enterprise Data Lake.
- Developed Hive programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
- Shared responsibility for administration of Hadoop, Hive.
- Assisted with data capacity planning and node forecasting
- Designed and worked with Hadoop Infrastructure team for code Implementation Process
- Designed the source feed processes End to End from getting data from Source Sending it to Staging, applying business rule and working with Moody’s team in loading the data to RO
- Assisted in migration of data from existing application to RO application
- Developed ETL solutions using DataStage to populate the Data Warehouse before the valuation feeds can be generated.
- Work closely with QA, Deployment, Infrastructure and business teams to ensure comprehensive testing and solution deployment
- Actively involved in streamlining the support process and the scheduling process using Autosys.
- Developed process in UNIX in collaboration with other team for file transfer purposes
- Designed and developed a model to efficiently mask/unmask customer Personally Identifiable Information across multiple passes, meeting continuously evolving requirements while staging data between the on-premises environment and other platforms
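The PII masking model in the last bullet might be sketched as follows; keyed hashing is an assumed technique for illustration (the real masking rules were client-specific). Deterministic tokens let the same customer value line up across multiple staging passes without exposing the raw data:

```python
import hmac
import hashlib

# Assumed secret key for illustration only; in practice this would come
# from a secure vault, never from source code.
MASK_KEY = b"example-secret-key"

def mask_pii(value: str) -> str:
    """Deterministically mask a PII value with a keyed hash (HMAC-SHA256).

    The same input always yields the same token, so masked records can
    still be joined across passes, but the original value cannot be
    recovered without the key.
    """
    digest = hmac.new(MASK_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated token for readability

token_a = mask_pii("123-45-6789")
token_b = mask_pii("123-45-6789")
token_c = mask_pii("987-65-4321")
print(token_a, token_b, token_c)
```

Unmasking in this scheme would be a lookup against a secured token vault rather than a reversal of the hash, which is one common design choice for staging data off-premises.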
Confidential
BI Consultant
Responsibilities:
- Reviewed designs, technical documents, code, and Mantis tickets
- Led and coordinated onsite and offshore teams
- Analyzed source data systems for Confidential based on the mapping documents and categorized them by source system
- Reviewed Confidential data models and design DataStage Jobs for the offshore developers
- Worked with testing team to find any data issues
- Work with business in creation of the inventory document as part of the conversion project, gap analyses based on Inventory document and mapping document
- Work with conversion team to resolve issues
- Part of the migration project to port claims data from the mainframe legacy system into the new vendor application (Guidewire)
- Designed and developed jobs for contact and contact roles defined at claims level
- Work with Confidential business analyst team to translate business requirements into transformation rules
Environment: IBM InfoSphere DataStage v8.5, DB2 v10/v11, Oracle 9i/10g, HP-UX, Linux 5.5
Confidential
Senior ETL Consultant
Responsibilities:
- Managed offshore ETL team of 3 DataStage developers.
- Worked on the store order control and store order item interface in DataStage 8.1.
- Improved existing jobs and fixed bugs in currently running jobs
- Conducted detailed data analysis/reporting using IBM WebSphere Information Analyzer for multiple sources from various systems
- Worked with the business to fix jobs that had problems and recommended enhancements to the existing process
- Improved performance of the EDW project and designed new DataStage jobs
- Worked on the migration project moving DataStage code from DataStage 8.1 to DataStage 8.
- Prepared ETL project estimation and measurements for project milestones
- Provided project delivery with the defined quality metrics and process
Environment: IBM InfoSphere DataStage and QualityStage v8.1/v8.7, Oracle 9i/10g, Linux 5.5, Shell scripting, CA ERwin, CRON Scheduler, TOAD
Confidential
Senior ETL Consultant
Responsibilities:
- Recommended roadmap and an approach for implementing the data integration architecture (with cost, schedule & effort estimates)
- Impact analysis, design, development and testing (mainframe (ISPF), Teradata database and ZEKE for scheduling)
- Enforced complex business rules while populating the new transaction code data to different data marts like CSDM, USDM and RBDM
- Liaison with project managers, architects, quality assurance analysts and system analysts
- Delivered the project with tight time lines while ensuring all the Confidential project implementation processes were fulfilled
- Worked on EDW projects for RBDM (Royal Bank Data Mart)
- Assisted with the requirements analysis, gap analysis and subsequent development
- Conducted unit tests and assist in test preparations to ensure data integrity, data quality and program quality
- Worked with offshore team during testing and shadowed them during MTP
- Worked on various modules for CSDM (Credit Card Services Data Mart) and USDM (U.S. Data mart) and RBDM module that reported production issue, which were part of SCR (Small Change Request)
- Fixed production issues in jobs (JCL) developed in mainframe.
- Used Teradata 11.x
- Provided change request, impact analyses, activity record, technical design documentation, PIV (Post Implementation Verification Document)
Environment: DataStage, Mainframe (ISPF), Teradata 11.x, UNIX, ZEKE Scheduling
Confidential
Senior ETL Consultant/Data Modeller
Responsibilities:
- Led the project spanning the full system development life cycle
- Led team that delivered solutions across multiple technical environments
- Direct experience in the architecture, integration, presentation, and governance of data in both OLTP (Transactional) databases and OLAP (Analytical) business intelligence data warehouses.
- Data Profiling using Information Analyzer for Column Analysis, Primary Key Analysis & Foreign Key Analysis
- Provided solution architecture which included - Information model, ETL architecture and presentation layer for the specialty EDW.
- Involved as primary on-site technical lead during the analysis, planning, design, development, and implementation stages of projects using IBM WebSphere software (QualityStage v8, Information Analyzer, ProfileStage).
- 1 year of data profiling experience with IBM Information Analyzer 8.0.1/ProfileStage (validating data values and column/table relationships, source-to-target field mappings, and source system profiling and analysis) jointly with SMEs and the data modeler
- Designed and implemented facts, dimensions and OLAP using dimensional data modeling standards in SQL Server 2008 that maintained data.
- Identified and defined Fact relationships. Maintained and deployed OLAP.
- Developed fact measures and multiple dimension hierarchies based on the OLAP reporting needs
- Designed product data integration module across the various lines of business
- Identified and promoted best practices and patterns for data modeling
- Provided oversight for all activities related to data cleansing, data quality and data consolidation
- Used standard data modeling methodologies and processes
- Wrote complex stored procedures and triggers to capture updated and deleted data from OLTP systems
- Worked on retail management system and warehouse management system process for Confidential
- Involved in design and development of process to send XML files from store to the warehouse and vice versa
- Worked within an information and communication technologies environment
- Facilitated requirements and design sessions with stakeholders
- Used the Agile methodology
- Created Oracle database solutions
- Architected and designed applications
- Created data models
- Prepared high level ETL design, process flow, source to target mappings, design of data standardization
- Provided deduplication and cleansing modules for client and business data, source names to achieve consolidated views of these data sets.
- Offered business requirement and design as well as developing DataStage process
- Recommended performance improvement solution to existing DataStage process and designed new jobs if required
- The ETL was designed to operate in closed-loop mode over the entities and attributes of the business entity
- Provided data exception reports showing critical violations of data/business rules
- Analyzed source systems for validation of business rules, data Integrity and data cleansing requirements; Vendor and Material master data
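The deduplication and cleansing modules described above can be illustrated with a small Python sketch; the normalization rule, survivorship rule, and sample records are assumptions (QualityStage implements far richer standardization and matching):

```python
def normalize_name(name: str) -> str:
    """Crude standardization: collapse whitespace and uppercase."""
    return " ".join(name.split()).upper()

def dedupe_clients(records):
    """Keep the first record seen for each normalized name.

    'First occurrence wins' is a deliberate simplification of real
    survivorship rules, which usually score candidate records.
    """
    seen = {}
    for rec in records:
        key = normalize_name(rec["name"])
        if key not in seen:
            seen[key] = rec
    return list(seen.values())

clients = [
    {"name": "Acme  Corp", "city": "Toronto"},
    {"name": "acme corp", "city": "Ottawa"},
    {"name": "Globex", "city": "Calgary"},
]
survivors = dedupe_clients(clients)
print(survivors)
```

The consolidated view mentioned in the bullets is the output of exactly this kind of pipeline: standardize, match on the standardized key, then apply a survivorship rule.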
Environment: ICT, DataStage 7.5, 8.5 and 8.7 (InfoSphere), QualityStage and Information Analyzer, Oracle 10g, HP-UX, Red Hat Linux 5.5, DB2, MS SQL Server, Agile, MS Project
Confidential
Senior DataStage Developer
Responsibilities:
- Worked on many applications, including e-one, the profitability analysis management application, and the A&C application, to optimize and improve performance
- Modified architecture to improve performance
- Led Solution Delivery team across multiple technical environments; provided solution architecture which included information model, ETL architecture and presentation layer for the specialty EDW
- Architected and designed applications
- Made use of JavaScript (HTML items) in Report Studio.
- Worked on report enhancements in Cognos.
- Supported reporting team in making reports and in finding issues related to data load in nightly reports.
- Prepared data flow diagrams, identified major streams like data conversion, historical, data retention archival, data quality, reconciliation
- Led offshore team of 10 people
- Directed all phases of the SDLC including the requirements analysis, architecture design, development, testing, deployment and ongoing support for all ETL applications
- Ensured all implementations are consistent with Audible's SDLC standards
- Facilitated requirements and design sessions with stakeholders
- Developed software application
- Used reporting data store solutions
- Used the Agile methodology
- Used Microsoft Visual Studio Team Edition
- Created Oracle database solutions
- Created data models
- Performed data profiling and analysis of source systems
- Provided data exception reports showing critical violations of data/business rules
- Assessed data quality and conformance issues between old internal source systems and the newly acquired source systems
- Determined the best cleansing strategy based on the data quality assessment
- Worked with GL module (General ledger) developed in Informatica
- Led the team that supported the GL application and provided recommendation on performance optimization
- Provided design change recommendation on the existing and new development for GL
- Performed data integration with E-one system using JD Edwards with DataStage
- Provided technical recommendations for optimized data access and retention for the data warehouse
- Provided solution architecture, high Level design documents
- Built repository of re-usable knowledge assets and implement best practices, concept documents for client education
- Managed the development team and the production support teams
- Participated in staff recruitment activities for the engagement and mentoring team members
- Technically involved with third party vendor in configuring InfoSphere DataStage 8.5 in DEV/QUA and Prod environments for DataStage upgrade project
- Migrated jobs from 7.1 to 8.5 (InfoSphere) with offshore technical team
- Designed and developed application for job auditing to compare data populated through jobs in both new and old environment
- Used the Audit jobs for performance testing
- Responsible for UNIT, system and integration testing
- Developed test scripts and test plan
- Participated in UAT (User Acceptance Testing by COGNOS Reporting Team)
- Worked closely with business analysts in gathering specific business change requests; ensured changes did not negatively impact existing applications (functional aspects of the applications)
- Provided RFS work for hiring third party contractor for specific requirement
- Worked with contractor closely in making SOW and also allotting work ensuring met on given time lines
- Provided design and development of processes for cement/Gypsum/GL and A&C business units
- Prepared technical design specification documents for business to approve the change recommended
- Worked on data integration module for order to cash (O2C) that gets data from E-One and uses in DataStage
- Supported and analyzed the GL module developed in Informatica 8.1.1
- Played role in performance tuning of daily nightly load for cement and gypsum processes
- Provided design/development of different Confidential data mart processes:
- Profitability Analysis Model, Quality Information Management System
- Production Distribution Planning, Advance Planning and Scheduling
- Strategic Network Optimization, Enterprise Performance Management, Procurement 2 Pay
- Provided requirements gathering and source data analysis
- Identified business rules for data migration and for design as well as developing data marts
- Provided product development lifecycle including business requirements, functional specifications, architectural specifications, development, test plans and documentation
- Played key role in data mirror migration project; analyzed impact of the migration
- Provided recommendation based on data dictionary for the mirror database so the expected performance is achieved
- Provided testing of the data mirror migration in order to capture the performance gain or loss after the change
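The job-auditing application described above, which compared data populated through jobs in the old and new environments, can be sketched as a keyed row-set comparison; the sample rows and key column are assumptions for illustration:

```python
def compare_rowsets(old_rows, new_rows, key_index=0):
    """Compare two row sets by key column.

    Reports keys missing on either side and keys whose rows differ,
    which is the core of an old-vs-new environment audit after a
    DataStage migration.
    """
    old_by_key = {r[key_index]: r for r in old_rows}
    new_by_key = {r[key_index]: r for r in new_rows}
    missing_in_new = sorted(set(old_by_key) - set(new_by_key))
    missing_in_old = sorted(set(new_by_key) - set(old_by_key))
    mismatched = sorted(
        k for k in set(old_by_key) & set(new_by_key)
        if old_by_key[k] != new_by_key[k]
    )
    return {"missing_in_new": missing_in_new,
            "missing_in_old": missing_in_old,
            "mismatched": mismatched}

old_env = [(1, "A", 100), (2, "B", 200), (3, "C", 300)]
new_env = [(1, "A", 100), (2, "B", 250), (4, "D", 400)]
report = compare_rowsets(old_env, new_env)
print(report)
```

An empty report on all three lists is the pass condition; anything else points the migration team at specific keys to investigate.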
Environment: ICT, DataStage 7.1 and 8.5 (InfoSphere), Informatica 8.1.1, Oracle 10g, HP-UX, Red Hat Linux 5.5, OVPM, Agile, Microsoft Visual Studio Team Edition, MS Project
Confidential
Senior DataStage Developer
Responsibilities:
- Managed the project spanning the full systems development life cycle
- Led the team that delivered solutions across multiple technical environments
- Developed order data mart for the business, which provided order details from different systems such as Siebel CRM/MSLV/IPACT
- Used DataStage to develop ETL jobs
- Developed DataStage jobs for the end-to-end process loading into the data mart
- Worked extensively on various Confidential process such as Violation Detection System/IPACT
- Used SQL and PL/SQL with the leadership role responsibility
- Provided DataStage training to other team members
- Created data warehouse
- Used reporting data store solutions
- Created Oracle database solutions
Environment: ICT, DataStage 7.5 (Manager, Designer, Director) DB2, Oracle 10G, 11G (PL/SQL), ERwin 4.0
Confidential
Responsibilities:
- Facilitated requirements and design sessions with stakeholders
- Developed strategies for ETL data from various sources into data warehouse/data marts
- Used Ascential DataStage (Manager, Designer and Director)
- Led the offshore team to implement that change
- Developed Batch Scripts with DataStage internal command to retrieve DataStage job status
- Wrote Batch Scripts to run and schedule DataStage jobs
- Created DataStage jobs, batches and job sequences and tuned them for better performance
- Performed data migration and conversion.
- Used the Agile methodology
- Worked on the Hashed File stage for lookups, and on the Oracle Bulk Load, ODBC, Aggregator, Sequential File, Link Partitioner and Link Collector stages
- Worked extensively with several stages like DRS, ODBC, merge etc.
- Imported and exported data between ERwin, DataStage
- Developed a knowledge base that loads data from any POS source into databases for further reporting and organizational analysis
- Investigated impact of adding new business requirements and procedures into existing data mart
- Scheduled meetings with business people, produced business and technical documentation, and walked through step-by-step development phases with business managers
- Provided technical requirements team discussion and work delegation planning
- Analyzed test cases and testing data requirements
- Provided implementation plan documentation and plan execution
- Developed the DataStage manual for JDA DataStage onsite developer
- Developed sample jobs for easy understanding of DataStage
- Helped customers with their DataStage-related problems
- Helped them build jobs per their requirements
- Helped customers gain a better and easier understanding of the DataStage ETL tool
- Provided DataStage development speedup
- Used MS Project for project management
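The batch scripts above that ran and queried DataStage jobs centered on the `dsjob` command-line interface. A sketch of a wrapper that builds those command lines is below; the project and job names are hypothetical, and the `dsjob` binary location varies by installation (only command construction is shown, not execution):

```python
# Path/name of the DataStage CLI; assumed to be on PATH for this sketch.
DSJOB = "dsjob"

def build_run_command(project: str, job: str) -> list:
    """Command line to run a job; -jobstatus makes dsjob wait and
    reflect the job result in its exit code, which batch scripts check."""
    return [DSJOB, "-run", "-jobstatus", project, job]

def build_status_command(project: str, job: str) -> list:
    """Command line to query a job's current status."""
    return [DSJOB, "-jobinfo", project, job]

cmd = build_run_command("DW_PROJECT", "LoadOrders")
print(" ".join(cmd))
# In the real batch scripts these commands were executed from shell (or via
# subprocess.run(cmd, check=True)) and scheduled through crontab.
```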
Environment: ICT, DataStage 7.5 (Manager, Designer, Director) PL/SQL, DB2, MicroStrategy 8, Windows 2000, Oracle 9i, ERwin 4.0, TOAD, MySQL, Perl, ASP, Universe Database, Agile, MS Project