
Mainframe & Hadoop Developer Resume


Columbus, OH

SUMMARY

  • 7 years of experience with a solid record of meeting business needs by analysing impact, developing functional designs, writing program specifications, estimating effort for requirements, coding, creating and implementing test plans and writing appropriate documentation.
  • Involved in all phases of the SDLC; handled complex projects in finance and developed applications that handle compensation, payroll and policy administration.
  • Experienced in gathering requirements, designing functional documents and building new COBOL, DB2 and IMS programs.
  • 6 years of experience working on payroll and financial systems in the insurance domain in a mainframe environment; for the past 7 months, working on a project that migrates data from the mainframe to a Hadoop environment.
  • Experienced in working in multiple LPARs.
  • Knowledge of production support, batch processing and the CA7 scheduler.
  • Able to analyze problems and resolve them with the least amount of disruption to the existing system by making use of debugging tools like Xpeditor.
  • Good technical knowledge in developing new applications as well as enhancing current systems.
  • Experience importing and exporting terabytes of data into HDFS and using HiveQL for querying and analysing the data.
  • Delivered multiple projects meeting client requirements by serving as Module lead and as a single point of contact.
  • Good domain knowledge in insurance and financial services.
  • Strong ability to communicate with other application development teams and to work on multiple projects and prioritize workload.
  • Developed expertise in client interaction.
  • Strong analytical, problem-solving, multitasking and strategic planning skills.
  • Very good exposure to all quality procedures complying with ISO 9002 standards and CMM Level 5.
  • Team player with good analytical, technical and interpersonal skills.
  • Proficient in MS Word, Excel, Outlook and PowerPoint (charts, tables, graphs).

TECHNICAL SKILLS

Programming Languages: COBOL, JCL, Quickjob, Easytrieve, Linux shell scripts.

Databases: DB2, IMS DB/DC.

Tools & Technologies: Active Archive, DBeaver, Xpeditor, File-AID, File-AID IMS, Panvalet, Eclipse, FAM, WAM, JSP, Dictionary, Platinum, CSL, Infopac, CA7 Scheduler, Control-M, Document Direct, CMOD, Unicenter (change management tool), ESP Scheduler.

Hadoop Ecosystem: Hadoop MapReduce, HDFS, Sqoop, HiveQL, HBase, Pig, Flume, Spark.

Cluster Management and Monitoring: Cloudera Manager, Ambari.

Operating systems: Linux, Windows 98/NT, OS/2, MVS, z/OS

File system: VSAM, IAM, Structured Files in Hadoop

Utilities: Putty, Beeline, TSO/ISPF, SPUFI, PRF

PROFESSIONAL EXPERIENCE

Confidential, Columbus, OH

Mainframe & Hadoop developer

Responsibilities:

  • Involved in gathering business requirements for the logic applied to the data being migrated to Hadoop.
  • Involved in the development and configuration of the tools and their business logic.
  • Developing COBOL programs to handle the claims present in yearly tape datasets and VSAM files; analysing code performance given the huge volume of data on archived claim tapes.
  • Coordinating with the other interfacing teams for system and performance testing.
  • This project involves structured data from mainframe files and other legacy system files amounting to 3 TB, a one-time migration of both active and archived claims.
  • The cluster uses 32 data nodes. Responsible for loading the data and applying business logic so the data is available to downstream front-end systems.
  • Using HiveQL, the data is stored in Hive tables partitioned by state and region for easy access by downstream systems (see the sketch after this list).
  • Responsible for migrating the tables to the production clusters using DistCp and validating them, thereby ensuring the consistency and volume of the data.
  • HiveQL and Pig scripts are used on top of HDFS clusters to get the claim details and their purge information.
  • Involved in system integration testing and performance testing in order to migrate and query the huge volume of data.
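
A minimal sketch of the partitioned Hive table and DistCp copy described above; the database, table, columns and namenode hosts are illustrative assumptions, not the project's actual objects:

    # Illustrative sketch: claims_db, the column layout and the namenode hosts are assumed.
    # Create a claims table partitioned by state and region for downstream reads.
    hive -e "
      CREATE TABLE IF NOT EXISTS claims_db.claims (
        claim_id      STRING,
        policy_number STRING,
        claim_amount  DECIMAL(12,2),
        claim_status  STRING
      )
      PARTITIONED BY (state STRING, region STRING)
      STORED AS ORC;"

    # Copy the table's warehouse directory to the production cluster with DistCp,
    # then compare directory/file counts and bytes as a basic volume validation.
    hadoop distcp -update \
      hdfs://dev-nn:8020/warehouse/claims_db.db/claims \
      hdfs://prod-nn:8020/warehouse/claims_db.db/claims
    hdfs dfs -count hdfs://prod-nn:8020/warehouse/claims_db.db/claims

Partition pruning on state and region is what lets downstream queries avoid scanning the full 3 TB.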

Confidential, Syracuse, NY

Hadoop developer

Responsibilities:

  • Involved in analysis of end user requirements and business rules and understanding the current system and its data flows.
  • Configuration and sizing of the required cluster and nodes based on the system drive capabilities.
  • This project involves structured data from mainframe files from different admin systems, accumulating about 2 TB of data yearly on top of almost 9 TB already present in the DB2 table.
  • The number of nodes in use increases each year as older data accumulates; 36 nodes will be used for the first year after implementation, so we are positioned to handle the file aggregation strategy.
  • Using Sqoop, extracted the data from the existing DB2 table and loaded it into HDFS clusters (see the sketch after this list).
  • Using HiveQL, the data is stored in Hive tables partitioned by run date as part of resource management, along with quality checks.
  • Hive partitions older than the minimum retention period of 180 days are also dropped in order to give read consistency.
  • As part of this data load, HiveQL and Pig scripts are used on top of HDFS to get policy, claim and mortality rates for end users and analysts to perform ad hoc analysis.
  • Developed Oozie workflow for scheduling and orchestrating these scripts.
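
A minimal sketch of the Sqoop pull and the 180-day partition retention described above; the JDBC URL, credentials, table and column names are illustrative assumptions:

    # Illustrative sketch: host, database, table and column names are assumed.
    # Pull the existing DB2 table into HDFS, splitting the work across 8 mappers.
    sqoop import \
      --connect jdbc:db2://db2-host:50000/POLDB \
      --username batch_user \
      --password-file /user/batch/.db2.password \
      --table POLICY_MASTER \
      --split-by POLICY_ID \
      --target-dir /data/staging/policy_master \
      --num-mappers 8

    # Enforce the 180-day minimum retention by dropping older run_date partitions.
    CUTOFF=$(date -d "180 days ago" +%Y-%m-%d)
    hive -e "ALTER TABLE policy_db.policy_master DROP IF EXISTS PARTITION (run_date < '${CUTOFF}');"

In practice, steps like these would be chained by the Oozie workflow mentioned above.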

Confidential, Syracuse, NY

Mainframe Senior Developer

Responsibilities:

  • Analysing the feasibility and impact of introducing a new tax benefit treatment in the existing agent payroll programs in our application.
  • Gathering requirements from the business clients and breaking the high-level requirements into lower-level ones for development.
  • Responsible for designing the changes in COBOL IMS programs that calculate recovery processing and earnings, as well as the changes to the recovery reports and commission statements.
  • Since this project also involved IMS PSB layout changes in the Master Agent database to support the new calculations, responsible for coordinating with the DBA and analysing the impact of recompiling the existing IMS programs.
  • Responsible for coding changes involving complex calculations in 20 impacted modules and 10 modules in commission processing and GDC calculations.
  • Responsible for coordinating with external application teams, including PeopleSoft, on interface changes, since this project changed interfacing files across all admin systems; dealt with 13 admin systems and 63 interfacing feeds.
  • Responsible for creating the project and unit test plans and serving as the single point of contact for the entire project with the Confidential clients.
  • Also involved in coding some complex programs in order to assist the offshore team with project deliveries.
  • Coordinating with the team to run multiple QA and UAT test cycles and with the business for their sign-off.
  • Analysing and fixing QA and UAT defects and troubleshooting the programs with the help of debugging tools like Xpeditor.
  • Preparing deployment checklist and implementation plan to meet the deadline of the project.

Confidential, Syracuse, NY

Mainframe Senior Developer

Responsibilities:

  • Analysing the total impacted components in our Legacy compensation system.
  • Around 85 JCLs and 120 procs are impacted in total and are to be replaced with our own routine.
  • Responsible for coordinating with external application teams on their interface balancing and for status calls with the client.
  • Responsible for creating the project plan and test plan.
  • Developing a new prototype and building the logic for the replacement routine.
  • Developing the unit test cases and the test and project plans.
  • Monitoring testing activities and reviewing code and testing documents.
  • Configuring the new jobs required for this project and scheduling them in the CA7 scheduler.
  • Analysing the impact of the newly created routines so that existing business functions are not affected.
  • Responsible for the release activities for these numerous components.
  • Analysing and fixing QA defects and troubleshooting the balancing counts.
  • Coordinating with the team to run multiple QA and UAT test cycles and with the business for their sign-off.
  • Preparing deployment checklist.
  • Monitoring and validating results in production so that our balancing counts stay in sync with the actual Infogix balancing proc.

Confidential, Syracuse, NY

Commission Calculation & Agent Payroll

Responsibilities:

  • Gathering requirements from the client through multiple meetings and designing the technical document for the project.
  • Providing estimates to the client based on the effort needed for the project.
  • Responsible for developing new programs and analysing existing programs in the system to enhance them with the required changes.
  • Handling the Offshore team as technical lead and managing the deliveries.
  • Developing a new test plan and reviewing the testing documents.
  • Monitoring testing activities and reviewing code and testing documents.
  • Configuring the new jobs required for this project and scheduling them in the CA7 scheduler.
  • Developing the JCLs to accommodate the output reports being sent to CMOD.
  • Testing and pushing the reports to the CMOD UAT region.
  • Analysing and fixing QA defects and troubleshooting using Xpeditor.
  • Coordinating with the team to run multiple QA and UAT test cycles and with the business for their sign-off.
  • Preparing the deployment checklist.
  • Verifying that the batch jobs run in production and that the IMS tables are populated correctly and accurately.

Confidential, Syracuse, NY

Commission Calculation & Agent Payroll

Responsibilities:

  • Involved in analysis of the existing system and feasibility study to undergo code changes.
  • Identifying the appropriate COBOL programs in the system to make the changes.
  • Involved in developing code and enhancing the existing COBOL programs to fit the new Term product into the system.
  • Coordinating the team as technical lead and managing the outputs.
  • Developing a new unit test plan and reviewing the testing documents.
  • Running multiple QA and UAT test cycles for the changes and sharing the results with the business.
  • Developing Quickjobs in the JCL required for the project.
  • Pushing the output reports to the CMOD region.
  • Coordinating with the Release management to deploy the project.
  • Providing post-implementation support and validating the results after deployment.
