Software Engineer Resume
SUMMARY:
- 14 years of experience in Information Technology, with a focus on handling large datasets
- 5 years of hands-on experience in Hadoop Big Data projects
- Proficient in the major Unix variants (AIX, Solaris and Linux)
- Hands-on experience setting up multi-node Hadoop clusters using the Hortonworks HDP 2.0 distribution hosted on Amazon EC2
- Proficient in Hadoop Data Lab setup
- Well versed in scripting languages such as Unix shell scripting, AWK, Pig and Confidential
- Strong working knowledge of the Autosys job scheduler
- Familiar with middleware technologies such as IBM WebSphere MQ and TIBCO
- Well acquainted with the ITIL processes
- Extensive experience in Application Management
- Good exposure to Retail and Investment Banking domains
- Competent in team management
TECHNICAL SKILLS:
Operating Systems: AIX, Sun Solaris and Linux
Languages: Core Java, Scala
Scripting Languages: Perl and Unix Shell Scripting
BigData Tech: Spark, HDFS, MapReduce, Hive, Pig, HBase and Sqoop
Middleware: IBM WebSphere MQ 7.0
Database: Oracle and Sybase
Web Technologies: HTML, JavaScript and CGI
Tools: BMC Remedy (Problem, Incident and Change Management), Autosys and Control-M job schedulers, Merant Dimensions and IBM mainframe Endevor version control
PROFESSIONAL EXPERIENCE:
Confidential
Software Engineer
Achievements:
- Bronze award for the successful Hadoop sourcing - Confidential
- Excellence Award for the successful and quick transition of the project from the vendor to Confidential
- Nominated for the excellence award (Confidential IT Division) for the successful handling of critical support
- Awarded the Confidential monthly Summit award in August 2009
- Awarded for the successful transition of the project from onsite to offshore at Confidential
Confidential
Architect and Development Lead
Responsibilities:
- Set up a Hadoop Data Lab to store and analyze Trade Finance data from the Financial Crime and Compliance (FCC) source systems
- Own the Data Lab in its entirety
- Enable SQL-based authorization through Beeline and restrict user access with ACLs
- Perform data mapping and create Red Flag entities
- Automate data loading into the Hive database with Sqoop, scheduled through Oozie (see the sketch after this list)
- Work with the SAS technical team to connect the Hadoop Data Lab to the SAS server so that analytics can be performed on the SAS side
- Automate report generation, especially the reconciliation reports
- Set up NAS storage to hold the data as part of the disaster recovery (DR) activity
- Educate Business Analysts from various project teams on accessing data from the Hadoop Data Lab
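A minimal sketch of the kind of Sqoop command such an Oozie-scheduled workflow could invoke for the nightly load; the JDBC connection string, credentials file, and database/table names are hypothetical placeholders, not details from the project:

    #!/bin/sh
    # Illustrative only: pull one FCC source table into Hive, overwriting the previous load.
    sqoop import \
      --connect jdbc:oracle:thin:@//fcc-src-db:1521/FCCPROD \
      --username etl_user \
      --password-file /user/etl/.sqoop_pwd \
      --table TRADE_FINANCE_TXN \
      --hive-import \
      --hive-database fcc_datalab \
      --hive-table trade_finance_txn \
      --hive-overwrite \
      -m 4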
Confidential
Architect and Developer
Responsibilities:
- Responsible for writing Linux scripts, setting up Autosys jobs, writing Pig scripts, Hive queries and Oozie workflows, and developing MapReduce programs as required.
- In addition to development work, participate in analysis and design, handle support activities and complete the project documentation.
Confidential
Team Lead
Responsibilities:
- Set up multi-node clusters for training and development
- Perform and document sanity checks after each successful cluster setup
- Validate the cluster for its stability
- Automate loading of weather data from Confidential onto Confidential for analysis
- Own starting and shutting down of the AWS clusters
- Bulk-load HBase using Pig and ImportTsv with CompleteBulkLoad (see the sketch after this list)
- Create Hive tables to access and analyze data stored in HBase
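A minimal sketch of that bulk-load flow, assuming a hypothetical HBase table named weather with a single column family d and placeholder HDFS paths; it writes HFiles with ImportTsv, completes the bulk load, and then exposes the HBase table to Hive for analysis:

    #!/bin/sh
    # Illustrative only: generate HFiles from a TSV extract, then hand them to HBase.
    hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
      -Dimporttsv.columns=HBASE_ROW_KEY,d:station,d:temp \
      -Dimporttsv.bulk.output=/tmp/weather_hfiles \
      weather /user/hadoop/weather.tsv

    hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles \
      /tmp/weather_hfiles weather

    # Map the HBase table into Hive so it can be queried with HiveQL.
    hive -e "
    CREATE EXTERNAL TABLE weather_hbase (rowkey STRING, station STRING, temp STRING)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:station,d:temp')
    TBLPROPERTIES ('hbase.table.name' = 'weather');"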
Confidential
Org Manager
Responsibilities:
- Installed and configured a multi-node Hadoop cluster of Linux test boxes
- Cleansed end-of-day POS Tlog files, a key source of retail information, from all the stores and loaded them into the Confidential
- Performed association mining on the given dataset, processing the complete data using Confidential to identify items that are associated with each other
- Obtained critical outputs such as total transactions for each store, total transactions per product, total inventory across all stores and total sales by demographics by processing the same data using HiveQL, and analyzed the results (see the sketch after this list)
- Managed the application entirely from offshore
- Performed batch processing, interacting with upstream and downstream systems for bad or missing feeds
- Primary owner of hundreds of production and staging jobs
- Managed the offshore team to provide uninterrupted support
- Managed all the change requests assigned to the team
- Managed new interface requests: created SRs, estimated the level of effort, obtained UAT approval from the client, implemented the production release and performed post-implementation activities
- Managed work allocation
- Reviewed all incident reports raised by the team before they were published
- Provided inputs to the team for enhancing the support
- Developed shell scripts for monitoring, reporting and alerting
- Imparted training to resources and decided task priorities
- Evaluated in-house tools/applications developed by other teams to improve the quality of the support
- Analyzed and arranged project and process related trainings for the team
- Took care of the team’s professional development plans
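A minimal sketch of the kind of HiveQL behind the store-wise and product-wise transaction counts; the pos_tlog table and its column names are assumptions for illustration, not the project's actual schema:

    #!/bin/sh
    # Illustrative only: run the aggregate queries over the cleansed Tlog data in Hive.
    hive -e "
    -- total transactions for each store
    SELECT store_id, COUNT(DISTINCT txn_id) AS total_transactions
    FROM pos_tlog
    GROUP BY store_id;

    -- total transactions per product
    SELECT product_id, COUNT(DISTINCT txn_id) AS total_transactions
    FROM pos_tlog
    GROUP BY product_id;"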
Confidential
Team Lead
Responsibilities:
- Provided 12x5 offshore support for batch processing
- Interacted with upstream and downstream systems for feeds, and escalated issues to developers and other support teams when the offshore team was unable to resolve them
- Owned about 15 Autosys batches (Production and Staging)
- Managed the offshore team in providing uninterrupted support during the stipulated support hours
- Responsible for signing off on incident reports (affecting core business deliveries) raised by the team before they were published
- Sent daily status report to the onsite project manager
- Apprised the onsite manager of issues before handing over the support
- Organized meetings with all the batch owners once every 2 weeks to share and learn about batch issues
- Developed Confidential and shell scripts for monitoring, alerting and reporting (see the sketch after this list)
- Developed Confidential CGI scripts to maintain a web-based support utility page
- Imparted training to resources and decided task priorities
- Evaluated any in-house tools or applications developed by other teams to improve the quality of the support
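A minimal sketch of the kind of monitoring/alerting shell script referred to above; the log path, failure pattern and distribution list are placeholders, not taken from the project:

    #!/bin/sh
    # Illustrative only: scan the current day's batch log for failures and mail an alert.
    LOG=/apps/batch/logs/eod_batch_$(date +%Y%m%d).log
    ALERT_TO="support-dl@example.com"

    if grep -qi "status: failure" "$LOG"; then
      grep -i "status: failure" "$LOG" | \
        mail -s "Batch failure detected on $(hostname)" "$ALERT_TO"
    fi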
Confidential
Team member
Responsibilities:
- Provided primary support to the ADE users across the global Confidential branches
- Performed root cause analysis on recurring issues using the Confidential debugger and fixed the underlying bugs
- Screened bugs to determine whether they reproduced in the latest version
- Reviewed the developers’ transactions (fixes) before they were merged onto the mainline
- Regularly updated the knowledge base/FAQ with new issues and their resolutions
Confidential
Team member
Responsibilities:
- Performed continuous batch monitoring to check for any failures
- Monitored jobs that took significantly longer than usual to finish and reported the delays to end users
- Analyzed the job failures and rectified them at the first level
- Performed daily batch status reporting to the client
- Notified risk managers and users of any business server and downstream report delays
- Performed position and sensitivity count checks and reported significant differences, if any, to the client
- Followed up with the feeder system for any feed delays
- Controlled the staging batch for testing purposes
- Involved in development activities
- Raised incident reports for core business issues
- Responsible for restarting the Front-End CORBA servers and VARCALC servers if they were down