Sr. Big Data Lead / DevOps Engineer Resume
PROFESSIONAL SUMMARY:
- 12.5+ years of strong IT experience in software platform building and successful implementation across Production, Development, Staging and QA environments.
- 3+ years of experience in administering, configuring and managing open source technologies such as Spark, Kafka, ZooKeeper, Docker and Kubernetes on RHEL.
- Successfully implemented CI/CD (DevOps automation) using Git and Jenkins to build Docker images and deploy them on a Kubernetes cluster.
- Successfully set up platforms on a standalone Spark cluster, Oracle GoldenGate, Kafka and ZooKeeper clusters, and Neo4j.
- Knowledge of Cloudera and Hortonworks distributions.
- Hands-on experience with NoSQL databases such as Vertica and MongoDB.
- 6+ years of experience in ETL methodologies using IBM WebSphere DataStage for extraction, transformation, manipulation and aggregation of data, NetExpert rule writing, and Centura Team Developer.
- 7+ years of hands-on experience with various database technologies: Oracle, MySQL, Teradata, BTEQ scripting and PL/SQL.
- 6+ years of hands-on experience in data analytics of telecom networks and various call detail records including E911, predictive data analysis, and time series data analysis.
- Experience in database design using the E-R model, database programming in Oracle, and data modeling.
- Provide production support, development integration testing and user acceptance testing, and aid in root cause analysis for trouble tickets.
- Goal-oriented and innovative, with excellent people skills and the ability to manage change with ease across diverse situations, technologies and domains.
- Excellent problem-solving and strong analytical skills.
- As part of an Agile team, participate in scrum grooming sessions and plan and estimate task capacity against resource availability.
TECHNICAL SKILLS:
Big Data / Open source technologies: Spark 1.6.2/2.1.0, Pig, Hive, Splunk, Neo4j 2.3.2, KNIME 3.1.x, Hue, ZooKeeper 3.4.8, R, Kafka 2.11.
CI/CD tools: Git, Jenkins, Nexus, Docker, Kubernetes.
Programming/scripting languages: Python, Scala, Shell, Java.
Databases: Oracle, PL/SQL, MySQL, Teradata.
NoSQL: MongoDB 3.2.9, Vertica.
Reporting: Crystal Reports, Cognos Reports.
Other tools: TIBCO BPM, TIBCO BW, HP Quality Center, JIRA, I-TRACK, Rally Dev.
PROFESSIONAL EXPERIENCE:
Confidential
Sr. Big Data Lead / DevOps Engineer
Responsibilities:
- Set up, configure and maintain the Development, QA and Production platforms of RAPTOR.
- Successfully implemented microservices CI/CD using Git and Jenkins to build Docker images and deploy them on a Kubernetes cluster via the AT&T Eco pipeline.
- Configure and administer a standalone Spark cluster.
- Configure and administer Kafka and ZooKeeper clusters, and Oracle GoldenGate Replicats that deliver data to Kafka.
- Configure and administer Kafka topics as per requirements.
- Developed and tested Spark jobs in Scala to fetch streams of data from Kafka topics (see the sketch after this list).
- Install Redis, Node.js, CherryPy, Anaconda and Neo4j on RHEL.
- Tune MongoDB queries and collection design to improve performance.
- Monitor Spark jobs on the UI and tune the Spark environment and code to improve job performance.
- Review the requirements in each iteration, define tasks, estimate efforts and deliver within timelines.
- Developed shell scripts to transfer files and scheduled them via cron.
- Handle deployment and operations activities to build the platform.
- Responsible for L1 support of the DEV, QA and Production environments.
- Lead the offshore team.
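A minimal sketch of the kind of Scala Spark job referenced in the bullets above: a Structured Streaming consumer reading records from a Kafka topic. The broker address, topic name and console sink are illustrative placeholders, not the actual RAPTOR configuration.

```scala
// Minimal Spark Structured Streaming consumer (Spark 2.1.x style).
// Broker address, topic name and sink are illustrative placeholders.
import org.apache.spark.sql.SparkSession

object KafkaStreamJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-stream-sketch")
      .getOrCreate()

    // Read a stream of records from a Kafka topic; requires the
    // spark-sql-kafka-0-10 package on the classpath.
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092") // placeholder broker
      .option("subscribe", "events-topic")               // placeholder topic
      .option("startingOffsets", "latest")
      .load()

    // Kafka delivers key/value as binary; cast to strings before processing.
    val messages = stream.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

    // Write the stream out; the console sink is used here only for illustration.
    val query = messages.writeStream
      .format("console")
      .option("truncate", "false")
      .start()

    query.awaitTermination()
  }
}
```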
Confidential
Lead ETL / DevOps Engineer
Responsibilities:
- Developed DataStage jobs utilizing sequential/complex file, transformer, filter, modify, join, merge and remove duplicates stages.
- Develop stored procedures (PL/SQL) and scripts to dynamically monitor the data quality of processed data by comparing it with raw data to detect issues such as data mismatch, data loss, KPI calculation errors and wrong aggregation (an illustrative sketch follows this list).
- Create views using DB links to access data from other databases.
- Analyze a variety of CDR data on Teradata/Vertica across various network elements to validate call flows (mobility) operating on different 2G/3G/VoLTE technologies.
- Implemented Apache Kafka on Cambria (UEB) services.
- Troubleshoot data to report gaps in algorithms, KPI calculations and call pattern issues to stakeholders.
- Carry out analysis using BTEQ scripts on simulated lab call detail records as well as live network calls for AT&T.
- Review data mapping, data models, data flow/extraction/aggregation logic and data retention policies for new KPIs/deliverables of the E2E system with the principal architect and the development and test teams.
- Review requirements in each iteration, define tasks, estimate efforts and deliver within timelines.
- Check data quality of the E2E system, including the ETL tool, predictive analytics layer and other integrated systems.
- Understand and test algorithms that identify different call patterns (mobility) and their complex KPI calculations.
- Conduct client demos before and after scheduled product releases to get client feedback as part of Agile methodology.
- Coordinate production deployment activities, support the ORT phase for users, and conduct and track RCA on production tickets.
- Train resources and lead the offshore team.
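The data quality monitoring described above was implemented as PL/SQL stored procedures and scripts; the sketch below illustrates the same raw-vs-processed comparison idea, shown in Spark/Scala (the same language as the earlier example) rather than PL/SQL. The table names (raw_cdr, agg_kpi) and the single call-count KPI are hypothetical.

```scala
// Illustrative raw-vs-processed data quality check in Spark/Scala.
// Table names (raw_cdr, agg_kpi) and the KPI definition are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DataQualityCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("dq-check-sketch").getOrCreate()

    // Recompute the KPI directly from raw records...
    val rawAgg = spark.table("raw_cdr")
      .groupBy("cell_id", "hour")
      .agg(count("*").as("raw_call_count"))

    // ...and compare it with what the ETL pipeline already aggregated.
    val processed = spark.table("agg_kpi")
      .select("cell_id", "hour", "call_count")

    // Any row where the counts differ points to data loss, duplication
    // or a wrong aggregation rule.
    val mismatches = rawAgg.join(processed, Seq("cell_id", "hour"), "full_outer")
      .where(coalesce(col("raw_call_count"), lit(0L)) =!= coalesce(col("call_count"), lit(0L)))

    mismatches.show(50, truncate = false)
    spark.stop()
  }
}
```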
Confidential
Lead Software Developer
Responsibilities:
- Delivered OSS southbound integration covering the NetExpert SA suite (Fault, Performance & SLA modules) and northbound integration with Remedy.
- Handled go-live activities and coordinated onsite and offshore activities.
- Handled NetExpert system administration, including installation, configuration and integration of the different modules (FMS, PMS and DMP) in the SIT, UAT and Production environments.
- Executed SIT & UAT plans onsite.
- Conducted NetExpert application and SOP training for the client in Fiji.
- Involved in preparing test cases, documentation and the final handover documents delivered to the client.
- Developed southbound interfaces in NetExpert with various EMSs such as Marconi, Provision, Alcatel 1353, AXE switches and NGN for FMS and PMS.
- Developed cluster monitoring and startup scripts catering to failover scenarios.
- Designed and developed alarm correlation and escalation policies using DMP.
- Designed and developed fault & performance reports in Crystal Reports.
- Designed and developed heartbeat monitoring functionality.
- Performed requirement gathering, BRS preparation and integration with NX for the NGN implementation.
Confidential
Software Developer
Responsibilities:
- Involved in requirement gathering and analysis phase.
- Solution design, implementation, configuration and testing of the SLA, Fault, Alarm and Network Diagrammer modules.
- Designed & developed Cognos reports.
- Prepared UAT Test cases.
Confidential
Software Developer
Responsibilities:
- Customized MAXIMO product modules for WFM, TT and SLA Manager.
- Designed and developed database tables, stored procedures and triggers.
- Prepared UAT Test cases.
Confidential
Software Developer
Responsibilities:
- Prepared the Functional Specification Document.
- Developed the portal and resolved user queries on functional issues.
- Prepared test cases and test data; carried out an internal POC on TIBCO BW and webMethods with Clarity (OSS).
- Designed and tested system integration of Clarity (OSS) using TIBCO BW and webMethods.
- Documented all APIs and their parameter details, including the state diagram of the NMS.
Confidential
System Analyst & Team Leader
Responsibilities:
- Analyze and monitor order uploading, which includes customer, location & service creation, and MACD operations, which include order creation at the OSS and updating all BSS-layer applications after order closure, while maintaining the SLA for the same.
- Analyze and handle errors in end-to-end order uploading and MACD operations by checking the DB and various system logs.
- Deploy new releases/patches in Production and perform UAT for proper end-to-end checks in the production environment as per the SRS.
- Analyze OMS-layer operations and server CPU & memory utilization to enhance system resources and scale servers for business requirements and smooth operation.
- Define rulebases in TIBCO Hawk for error detection and monitoring of TIBCO operations.
- Monitor OMS interfaces with ADC, CRM, eNIMS, eCMDE, Payment Engine (BSS) & Clarity (OSS).
- Ensure data consistency across all applications in the BSS and OSS layers and perform the data reconciliation process.
- Generate requirements to automate operations/processes at the OMS layer.
- Involved in BPM engine & server split activities with DBAs and developers.
- Maintain order XMLs, logs, the OMS portal and the InConcert server; perform database analysis, data purging activities and tuning of heavily loaded queries with help from DBAs & developers.
- Update SOPs, share them with team members, and generate reports.
- Main goal: make the system zero-defect, zero-delay and zero-touch.
Confidential
Sr. Software Programmer
Responsibilities:
- Designed & developed modules using Centura Team Developer (CTD) & SQL Server 2000.
- Developed reports using Centura Report Builder.
- Developed SQL procedures supporting the application and report generation.
- Developed modules to import and export files between SQL Server 2000 and the stock exchange.
- Wrote modules using Visual Basic to upload files from MS Excel to the database.
- Updated existing modules as per requirements.