
Splunk Developer Resume


Carmel, IN

SUMMARY

  • Around 7 years of experience as a Splunk Developer in the Information Technology field, with a strong emphasis on Business Intelligence (BI) and Data Warehousing (DW) projects and RDBMS applications under different environments.
  • Expertise in Actuate reporting: development, deployment, management, and performance tuning of Actuate reports.
  • Knowledge of various chart types, alert settings, and app creation; experience creating and managing apps, users, and roles, and granting permissions to knowledge objects.
  • Understanding of parsing, indexing, and searching concepts, including Hot, Warm, Cold, and Frozen bucketing.
  • Field extraction using the Interactive Field Extractor (IFX), the rex command, and regular expressions in configuration files.
  • Experience in Operational Intelligence using Splunk.
  • Knowledge of Splunk architecture and its components (indexer, forwarder, search head, deployment server), Heavy and Universal Forwarders, and the license model.
  • Timechart attributes such as span and bins; tags and event types; creating dashboards and reports using XML; creating dashboards from searches; scheduled searches, including inline vs. scheduled searches in a dashboard.
  • Knowledge of the extract keyword, sed, knowledge objects, and various search commands such as stats, chart, timechart, transaction, strptime, strftime, eval, where, xyseries, and table, including the difference between eventstats and stats.
  • Familiarity with Business Activity Monitoring (BAM) concepts, such as IBM's BAM tooling.
  • Use of techniques to optimize searches for better performance, including search-time vs. index-time field extraction, with an understanding of configuration files, their precedence, and how they work.
  • Worked with props.conf, transforms.conf, inputs.conf, and outputs.conf, including setting up a forwarder and monitor stanzas in inputs.conf (see the configuration sketch after this list).
  • Extensive Data Warehouse experience using Informatica 7/8.x/9 Power Center tools (Source Analyzer, Mapping Designer, Mapplet Designer, Transformation Designer, Repository Manager, and Server Manager) as the ETL tool on Oracle/DB2 databases.
  • Extensive experience in writing Packages, Stored Procedures, Functions, and Database Triggers using PL/SQL and UNIX shell scripts; also handled Oracle utilities such as SQL*Loader and Import.
  • Experience in the data mart life cycle; performed ETL procedures to load data from different sources into data marts and the data warehouse using Informatica Power Center.
  • Experienced in all data processing phases, from enterprise modeling and data modeling (logical and physical) to data warehousing (ETL).
  • Extensive experience in developing complex mappings with varied transformations such as Router, Filter, Sorter, Connected and Unconnected Lookup, Normalizer, Expression, Aggregator, Joiner, Union, Update Strategy, Stored Procedure, and Sequence Generator.
  • Database experience using Oracle 11g/10g/9i/8.x/7.x, MS SQL Server 2008, Teradata and DB2.
  • Working knowledge of data warehouse techniques and practices, including ETL processes, dimensional data modeling (Star Schema, Snowflake Schema, Fact and Dimension tables), OLTP, and OLAP.
  • Strong experience using SQL, PL/SQL Procedures/Functions, Triggers and Packages.
  • Good understanding of Views, Synonyms, Indexes, Joins, and Sub-Queries.
  • Excellent communication, presentation, and project management skills; a very good team player and self-starter with the ability to work independently and as part of a team.
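
A minimal sketch of the forwarder monitoring and search-time field extraction described above, assuming a hypothetical web application log; the index, sourcetype, server, and field names are illustrative only.

    # inputs.conf on a Universal Forwarder - monitor stanza (hypothetical path/names)
    [monitor:///var/log/webapp/access.log]
    index = web_app
    sourcetype = webapp_access
    disabled = false

    # outputs.conf on the forwarder - send events to the indexer tier
    [tcpout]
    defaultGroup = primary_indexers

    [tcpout:primary_indexers]
    server = indexer1.example.com:9997

    # props.conf on the search head - apply a search-time extraction to the sourcetype
    [webapp_access]
    REPORT-response_fields = extract_response_fields

    # transforms.conf - regex that pulls status and response time into fields
    [extract_response_fields]
    REGEX = status=(?<status>\d{3})\s+resp_time=(?<resp_time_ms>\d+)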

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 9.5/9.1/8.5/8.1.1/7.1.2, Informatica Designer, Workflow Manager, Workflow Monitor, Data Mart, Mapplet, Transformations, Autosys, SQL*Plus.

Data Analysis: Requirement Analysis, Business Analysis, detail design, data flow diagrams, data definition table, Business Rules, data modeling, Data Warehousing, system integration.

Data Modeling: Dimensional Data Modeling (Star Schema, Snow-Flake, FACT-Dimensions), Conceptual, Physical, and Logical Data Modeling, ER Models, OLAP, OLTP concepts.

Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2012/2008/2005/2000, MS Access.

Programming: SQL, PL/SQL, SQL Plus, Unix Shell Scripting, C

Environment: Win 9x/NT/2000/XP, Unix.

PROFESSIONAL EXPERIENCE

Confidential - Carmel, IN

Splunk Developer

Responsibilities:

  • Handled development, deployment, management, and performance tuning of Actuate reports.
  • Provided regular support and guidance to Splunk project teams on complex solutions and issue resolution; created dashboards, reports, scheduled searches, and alerts.
  • Applied knowledge of Splunk architecture and its components (indexer, forwarder, search head, deployment server), Heavy and Universal Forwarders, and the license model.
  • Integrated ServiceNow with Splunk to generate incidents from Splunk.
  • Worked on DB Connect configuration for Oracle, MySQL, and MS SQL Server.
  • Created many of the proof-of-concept dashboards for IT operations and service owners, which are used to monitor application and server health.
  • Performed field extraction using the Interactive Field Extractor (IFX), the rex command, and regular expressions in configuration files.
  • Worked with various chart types, alert settings, and app creation; created and managed apps, users, and roles, and granted permissions to knowledge objects.
  • Applied parsing, indexing, and searching concepts, including Hot, Warm, Cold, and Frozen bucketing.
  • Involved in standardizing Splunk forwarder deployment, configuration and maintenance across UNIX and Windows platforms.
  • Worked on setting up Splunk to capture and analyze data from various layers: load balancers, web servers, and application servers.
  • Captured data from various front-end and middleware applications.
  • Created dashboards to monitor traffic volume, response times, errors, and warnings across applications (a sample dashboard search is sketched after this list).
  • Used techniques to optimize searches for better performance, including search-time vs. index-time field extraction, with an understanding of configuration files, their precedence, and how they work.
  • Created dashboards from searches and scheduled searches, weighing inline vs. scheduled searches within a dashboard.
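
As a sketch of the kind of search that might back one of the dashboard panels above, assuming the hypothetical web_app index and fields from the earlier configuration example:

    index=web_app sourcetype=webapp_access
    | timechart span=5m avg(resp_time_ms) AS avg_response_ms, count(eval(status>=500)) AS server_errors

A search like this could run inline in the panel or as a scheduled search, depending on the volume and refresh requirements noted above.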

Environment: SPLUNK 6.0.7, Linux, Unix, Oracle 11g, MS SQL Server 2012, SQL.

Confidential, Boston, MA

Sr. Informatica Developer

Responsibilities:

  • Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
  • Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
  • Extensive experience setting up Splunk to monitor customer volume and track customer activity.
  • Worked on capturing, analyzing, and monitoring the Confidential online banking application.
  • Set up Splunk for Confidential online banking, creating dashboards and alerts to monitor the online banking front-end and middleware applications.
  • Triaged and resolved various complex production issues by analyzing data from monitoring tools and application logs, working with multiple teams in real time on conference calls.
  • Also worked on code changes for various maintenance items and customer-reported bugs as part of production support.
  • Worked in Level 3 production support team in the bank's phone banking channel, which includes VRU (Voice Response Unit) and CTI (Computer Telephony Integration).
  • Achievements include a complete revamp of the Confidential testing and production environments, including retiring hardware, which produced around $1 million in savings on hardware maintenance and cost.
  • Hands-on experience in application design, configuration management, performance tuning, code reviews, object-oriented design, and release and change management.
  • Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into enterprise data warehouse.
  • Ran and monitored daily scheduled jobs using Workload Manager to support EDW (Enterprise Data Warehouse) loads for both history and incremental data.
  • Investigated failed jobs and wrote SQL to debug data load issues in production (see the sample reconciliation query after this list).
  • Involved in transferring processed files from the mainframe to the target system.
  • Supported the code after production deployment.
  • Interacted with the source team and the business to validate the data.
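
A minimal sketch of the kind of SQL used to debug a load issue, assuming hypothetical staging and warehouse tables and Oracle-style syntax; in practice the table and column names come from the affected mapping.

    -- Compare source and target row counts for today's load (hypothetical tables)
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt
      FROM stg_customer_txn
     WHERE load_date = TRUNC(SYSDATE)
    UNION ALL
    SELECT 'TARGET' AS side, COUNT(*) AS row_cnt
      FROM edw.fact_customer_txn
     WHERE load_date = TRUNC(SYSDATE);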

Environment: Informatica Power Center 9.5/9.1, Linux, UNIX, SQL, PL/SQL, MS Access, BO XI R2, Erwin, Shell Scripts, Rapid SQL, PVCS, Visio, AutoSys.

Confidential, Atlanta, GA

Informatica Developer

Responsibilities:

  • Designed and developed ETL process using Informatica tool.
  • Worked with various active transformations in Informatica Power Center, such as Filter, Aggregator, Joiner, Rank, Router, Sorter, Source Qualifier, and Update Strategy transformations.
  • Responsible for extracting data from Oracle, Sybase, and flat files.
  • Responsible for data cleansing of the source data.
  • Responsible for Performance Tuning in Informatica Power Center.
  • Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into enterprise data warehouse.
  • Extensively made use of sorted input option for the performance tuning of aggregator transformation.
  • Prepared the error handling document to maintain the error handling process.
  • Validated the Mappings, Sessions & Workflows, Generated & Loaded the Data into the target database
  • Created various tasks like Pre/Post Session, Command, Timer and Event wait.
  • Extensively used the SQL Override function in the Source Qualifier transformation.
  • Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner transformation.
  • Worked with the Update Strategy transformation using constants such as DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE (see the expression sketch after this list).
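
A minimal sketch of the kind of Update Strategy expression implied above; the port names (lkp_customer_key, src_deleted_flag) are hypothetical.

    -- Flag each row for insert, update, or delete based on a lookup result
    IIF(ISNULL(lkp_customer_key), DD_INSERT,
        IIF(src_deleted_flag = 'Y', DD_DELETE, DD_UPDATE))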

Environment: Informatica Power Center 9.0 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer), Oracle 10g, SQL, PL/SQL, Teradata, Flat Files, Autosys, Star Team, UNIX, Linux, Windows XP.

Confidential

Informatica Developer

Responsibilities:

  • Extensively worked on Expression, Source Qualifier, Union, Filter, Sequence Generator, Sorter, Joiner, and Update Strategy transformations.
  • Developed Informatica mappings to populate data into dimension and fact tables for data classifications used by end developers.
  • Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.
  • Used mapping and session variables/parameters and parameter files to support change data capture and to automate the workflow execution process, providing 24x7 data availability.
  • Designed and developed ETL processes based on business rules and a job control mechanism using Informatica Power Center 7.1; re-engineered existing mappings to support new and changing business requirements.
  • Created UNIX shell scripts to extract data from various sources such as Oracle and flat files, load the transformed data into the target database, and schedule the sessions and workflows.
  • Created test cases for the developed mappings and then prepared the integration testing document.
  • Prepared the error handling document to maintain the error handling process.
  • Validated the Mappings, Sessions & Workflows, Generated & Loaded the Data into the target database.
  • Monitored batches and sessions for weekly and Monthly extracts from various data sources to the target database.
  • Wrote UNIX shell scripts (pre/post-session commands) for the sessions, as well as shell scripts to kick off workflows, unschedule workflows, and get workflow status (see the pmcmd sketch after this list).
  • Tuned SQL statements, mappings, sources, targets, transformations, sessions, the database, and the network for bottlenecks, and used Informatica parallelism options to speed up data loading to the target.
  • Created synonyms for copies of time dimensions, used the sequence generator transformation type to create sequences for generalized dimension keys, stored procedure transformation for encoding and decoding functions and Lookup transformation to identify slowly changing dimensions (SCD).
  • Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy for Designing and optimizing the Mapping.
  • Created various tasks like Pre/Post Session, Command, Timer and Event wait.
  • Tuned mapping performance by following Informatica best practices and applied several methods to reduce workflow run times.
  • Created Materialized view for summary data to improve the query performance.
  • Prepared SQL Queries to validate the data in both source and target databases.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
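
A minimal sketch of the kind of shell wrapper used to kick off a workflow and check its status with pmcmd; the integration service, domain, folder, and workflow names are hypothetical, and the connection flags shown follow the later (8.x/9.x) pmcmd style, which varies by PowerCenter version.

    #!/bin/sh
    # Start the daily load workflow and wait for it to finish (hypothetical names)
    pmcmd startworkflow -sv INT_SVC_DEV -d DOM_DEV -u "$INFA_USER" -p "$INFA_PWD" \
        -f SALES_DM -wait wf_daily_sales_load

    # Report the workflow's run status
    pmcmd getworkflowdetails -sv INT_SVC_DEV -d DOM_DEV -u "$INFA_USER" -p "$INFA_PWD" \
        -f SALES_DM wf_daily_sales_load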

Environment: Informatica Power Center 7.1.2, Oracle, Mainframe, DB2, COBOL, VSAM, SQL, PL/SQL

Confidential

SQL Developer

Responsibilities:

  • Production Implementation and Post Production Support.
  • Wrote stored procedures and reviewed the code for efficiency (a minimal T-SQL sketch follows this list).
  • Maintained and corrected Transact-SQL (T-SQL) statements.
  • Performed daily monitoring of database performance and network issues.
  • Administering the MS SQL Server by Creating User Logins with appropriate roles, dropping and locking the logins, monitoring the user accounts, creation of groups, granting the privileges to users and groups.
  • Managing databases, tables, indexes, views, stored procedures.
  • Enforcing business rules with triggers and user defined functions, troubleshooting, and replication.
  • Worked with SQL authentication.
  • Rebuilding indexes on various tables.
  • Preparing Test Cases and performing Unit Testing.
  • Review of Unit and Integration test cases.
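
A minimal T-SQL sketch of the kind of stored procedure and index maintenance mentioned above; the table, column, and procedure names are hypothetical, and the index rebuild uses the SQL Server 2000-era DBCC syntax.

    -- Hypothetical lookup procedure
    CREATE PROCEDURE dbo.usp_GetOrdersByCustomer
        @CustomerID INT
    AS
    BEGIN
        SET NOCOUNT ON
        SELECT OrderID, OrderDate, TotalAmount
        FROM dbo.Orders
        WHERE CustomerID = @CustomerID
    END
    GO

    -- Rebuild all indexes on the table
    DBCC DBREINDEX ('dbo.Orders')
    GO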

Environment: MS SQL Server 6.5, SQL Server 7, MS SQL Server 2000
