Hadoop Developer / Spark Developer Resume
Chicago, IL
SUMMARY:
- Around 13 years of IT experience in the Big Data (Hadoop) ecosystem, Java, Oracle, Unix and Mainframe technologies.
- Extensively worked on HDFS, Apache Pig, Hive, NiFi, Spark (Spark Streaming and Spark SQL), Scala, Sqoop, Kafka, HBase, Impala and Oozie, with extensive knowledge of MapReduce, HCatalog, MongoDB, Flume and Python.
- Extensively worked on the Cloudera platform, including Cloudera Manager, Cloudera Navigator, Sentry, Hue and Kerberos.
- Worked on EBCDIC, XML, JSON, VSAM and Parquet files.
- Expertise in ETL and ELT, with strong implementation knowledge of Slowly Changing Dimensions (SCD) and Change Data Capture (CDC); see the sketch after this list.
- Experienced in code implementation in staging environments, production implementation, maintenance and post-implementation support.
- Experienced in Agile (Scrum and Kanban) and Waterfall methodologies.
- Well versed in SAFe and the Agile tracking tools VersionOne and TFS.
- Excellent knowledge of requirements grooming, release planning and creating user stories.
- Expertise in the areas of Business Analysis, Data Analysis, Data Modeling, System Design, Application Development, Maintenance, Production Support, Decommissioning and Re-engineering projects in Hadoop and Mainframe environments.
- Expertise with the full Software Development Life Cycle (SDLC) and its documentation, including requirements gathering, analysis of project requirements and existing systems, detailed design, technical design documents, development and testing of applications.
- Expertise in Mainframe batch and online applications using COBOL, JCL, DB2, CICS, VSAM, IMS DB and MQ Series.
- Excellent SQL developer skills, including stored procedures, indexed views, user-defined functions and triggers.
- Knowledge of job scheduling, performance tuning, normalization/de-normalization concepts and database design methodology.
- Excellent organizational, analytical and teamwork skills, along with good communication and interpersonal skills; a good team player able to handle tasks independently.
- Experience in the Healthcare, Auto & Home Insurance, and Banking and Financial Services domains.
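The SCD and CDC work summarized above follows the common Slowly Changing Dimension Type 2 pattern; the sketch below is a minimal Spark/Scala version of it, not any project's actual code. All table and column names (dw.customer_dim, staging.customer_delta, customer_id, address) are hypothetical placeholders, and the staging extract is assumed to share the dimension's column layout.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal SCD Type 2 sketch: expire changed rows, append new versions.
object ScdType2Sketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("scd2-sketch")
      .enableHiveSupport().getOrCreate()
    import spark.implicits._

    val current  = spark.table("dw.customer_dim").where($"is_current" === true)
    val incoming = spark.table("staging.customer_delta")

    // Business keys whose tracked attribute changed in this extract.
    val changedKeys = current.alias("c")
      .join(incoming.alias("i"), "customer_id")
      .where($"c.address" =!= $"i.address")
      .select("customer_id")

    // 1) Close out the old versions of the changed rows.
    val expired = current.join(changedKeys, Seq("customer_id"), "left_semi")
      .withColumn("is_current", lit(false))
      .withColumn("end_date", current_date())

    // 2) The incoming rows become the new current versions.
    val opened = incoming.join(changedKeys, Seq("customer_id"), "left_semi")
      .withColumn("is_current", lit(true))
      .withColumn("start_date", current_date())
      .withColumn("end_date", lit(null).cast("date"))

    // Untouched current rows + expired + new versions = rebuilt dimension
    // (written to a new table name, since Spark cannot overwrite a table
    // while reading from it).
    current.join(changedKeys, Seq("customer_id"), "left_anti")
      .unionByName(expired)
      .unionByName(opened)
      .write.mode("overwrite").saveAsTable("dw.customer_dim_rebuilt")
  }
}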
TECHNICAL SKILLS:
Operating Systems: LINUX, MVS, WINDOWS, MS-DOS, UNIX, SOLARIS
Hardware: IBM Mainframe, CISCO Servers, LANs, PC compatible
Languages: SQL, Core Java, COBOL, Scala, PL/1, Easytrieve, JCL, PL/SQL
OLTP: CICS, IMS/DC
Databases: Hive, HBase, Impala, DB2, IMS DB, Oracle
File Systems / File Formats: HDFS, VSAM, XML, EDI, JSON, Parquet and EBCDIC
Big Data: Hadoop ecosystem (HDFS, MapReduce, NiFi, Pig, Hive, HBase, Sqoop, Oozie, Zookeeper, Spark (Streaming and Spark SQL), Flume, Kafka), MongoDB, Impala, etc.
Big Data Distributions: Cloudera and Hortonworks
Middleware: DB2 Connect, MQ Series
ETL: IBM SAFR
Batch Schedulers: CA-7, Control-Confidential, Oozie and Autosys
Configuration Management: VSTS, Subversion, GIT, Changeman, Endevor and RMS
Incident Management: Service Manager, Clarify, Remedy
Others: MS Office, Abend-Aid, ISPF, Syncsort, Clarity, BMC, Panvalet, Visio, etc.
PROJECT EXPERIENCE:
Confidential, Chicago, IL
Hadoop Developer / Spark Developer
- Imported and exported data between HDFS and DB2/Teradata using Sqoop (see the ingestion sketch after this list); responsible for managing the data coming from different sources.
- Analyzed and groomed requirements, helped create user stories and entered them into VersionOne.
- Participated in release planning, sprint ceremonies and daily stand-ups; responsible for the design and development of Big Data applications using Cloudera Hadoop.
- Created Hive tables, loaded them with data and wrote Hive queries that run internally as MapReduce jobs (see the Hive sketch after this list).
- Created the tasks and detailed technical documentation for each assigned user story.
- Designed, built, tested, scheduled and deployed Oozie workflows.
- Designed, built, tested and deployed Spark SQL programs in Scala.
- Designed, built, tested and deployed Hive scripts and shell scripts.
- Defined and established the production support process for the Big Data platform.
- Helped the team with debugging and interacted with Cloudera and the infrastructure team on any issues.
- Attended Scrum meetings, provided status to the Scrum Master and served as backup to the Scrum Master.
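Sqoop itself is driven from the command line; to keep every sketch here in Scala, the equivalent DB2-to-HDFS pull is shown below as a Spark JDBC read instead. This is a minimal sketch only: the connection URL, credentials, table name and bounds are hypothetical, not the project's actual configuration.

import org.apache.spark.sql.SparkSession

// Sketch of a parallel DB2 extract landed on HDFS as Parquet.
object Db2ToHdfsSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("db2-ingest-sketch").getOrCreate()

    val claims = spark.read.format("jdbc")
      .option("url", "jdbc:db2://db2host:50000/CLAIMSDB")   // hypothetical
      .option("driver", "com.ibm.db2.jcc.DB2Driver")
      .option("dbtable", "CLAIMS.DAILY_CLAIMS")             // hypothetical
      .option("user", sys.env("DB2_USER"))
      .option("password", sys.env("DB2_PASS"))
      // Split the read into parallel slices, like Sqoop's -m / --split-by.
      .option("partitionColumn", "CLAIM_ID")
      .option("lowerBound", "1")
      .option("upperBound", "10000000")
      .option("numPartitions", "8")
      .load()

    // Land the extract on HDFS as Parquet for downstream Hive/Impala use.
    claims.write.mode("overwrite").parquet("/data/raw/claims/daily")
  }
}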
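The Hive and Spark SQL bullets above describe a load-then-query pattern; this is a minimal Scala sketch of it, creating a partitioned Hive table over the landed extract and running an aggregate. Database, table, column names and the partition date are hypothetical.

import org.apache.spark.sql.SparkSession

// Sketch: create a partitioned Hive table, load it, query it.
object HiveLoadAndQuerySketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("hive-sketch")
      .enableHiveSupport().getOrCreate()

    spark.sql("""
      CREATE TABLE IF NOT EXISTS claims_db.daily_claims (
        claim_id BIGINT, member_id BIGINT, claim_amt DECIMAL(12,2))
      PARTITIONED BY (load_dt STRING)
      STORED AS PARQUET""")

    // Stage the raw Parquet extract and append it under one partition.
    spark.read.parquet("/data/raw/claims/daily")
      .createOrReplaceTempView("staged_claims")

    spark.sql("""
      INSERT INTO claims_db.daily_claims PARTITION (load_dt = '2016-06-01')
      SELECT claim_id, member_id, claim_amt FROM staged_claims""")

    // The same query Hive would run as MapReduce; Spark runs it in-memory.
    spark.sql("""
      SELECT member_id, SUM(claim_amt) AS total_paid
      FROM claims_db.daily_claims
      WHERE load_dt = '2016-06-01'
      GROUP BY member_id""").show(20)
  }
}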
Environment: Hadoop 2.x, Pig, HDFS, Scala, Spark, Sqoop, HBase, Oozie, Java, GIT, PuTTY, Tableau, MapReduce, Hive, Teradata, MySQL, SVN, Zookeeper, Linux Shell Scripting.
Confidential, Chicago, IL
Hadoop Developer / Spark Developer
- Worked on analyzing the Hadoop cluster and different Big Data analytic tools, including Pig, Hive and Sqoop.
- Established connections with the different source systems, analyzed those systems and finalized the requirements.
- Analyzed the existing Informatica workflows and designed the technical architecture and application customization.
- Prepared the design documents (FSD, TD, Test Plan, Deployment Plan).
- Coding, test case preparation, unit testing and integration testing (HDFS, Hive, Pig, Sqoop and Oozie).
- Defined the Hive DDLs, Teradata DDLs and workflows, and scheduled them.
- Used compression techniques, chiefly Snappy on Hive tables, to save storage and optimize data transfer over the network (see the sketch after this list).
- Designed, wrote and implemented Spark programs in Scala.
- Helped the team with debugging and interacted with Cloudera on issues.
- Created Oozie workflows to run multiple Hive jobs.
- Attended Scrum meetings, provided status to the Scrum Master and served as backup to the Scrum Master.
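A minimal sketch of the Snappy-on-Hive setup mentioned above: the table is declared as Snappy-compressed Parquet and Spark's own Parquet writer is pointed at the same codec, so each write shrinks both storage and network transfer. Table and column names are hypothetical.

import org.apache.spark.sql.SparkSession

// Sketch: Snappy-compressed Parquet storage for a Hive table.
object SnappyHiveSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("snappy-sketch")
      .enableHiveSupport().getOrCreate()

    // Make Spark's Parquet output Snappy-compressed as well.
    spark.conf.set("spark.sql.parquet.compression.codec", "snappy")

    spark.sql("""
      CREATE TABLE IF NOT EXISTS edw.policy_snapshot (
        policy_id BIGINT, premium DECIMAL(12,2), state_cd STRING)
      STORED AS PARQUET
      TBLPROPERTIES ('parquet.compression' = 'SNAPPY')""")

    // Columns must line up positionally for insertInto.
    spark.table("staging.policy_extract")
      .select("policy_id", "premium", "state_cd")
      .write.mode("append").insertInto("edw.policy_snapshot")
  }
}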
Environment: HDFS, Cloudera (CDH 4), Hive, Impala, HBase, Scala, Spark, Flume, Sqoop, Java, Eclipse, Teradata, MongoDB, Ubuntu, Linux.
Confidential, Chicago, IL
System Analyst
- Design, development, unit testing, supporting system testing, resolving production issues and post-production support of mainframe host batch/online processing for integrated customer platform applications using COBOL, JCL, CICS, IMS, DB2 and MQ Series.
- Participated in meetings with business users to gather business requirements, discuss pending requirements and set priorities.
- Involved in the design, coding and testing of COBOL/DB2 stored procedures.
- Prepared/modified JCLs and PROCs for special requirements, e.g. data fixes and deletion of data based on retention date.
- Used the Service Management tool HP Service Manager to support Incident, Problem, Change, Configuration and service level management processes
- Participated in code walkthroughs and test case review meetings.
- Attended the weekly status meetings for project status updates.
- Analyzed the DC 2/3 online and batch applications; prepared the high-level design documents and component specifications; developed and tested the changes; documented the test cases and their results; and conducted reviews, using COBOL, PL/1, XML, JCL, DB2, CICS and IMS.
- Wrote unit test plans, integration test plans
- Responsible for coding and implementation of Auto flow charts.
- Participated in identifying poorly performing SQL queries, mainly in 24x7 batch processing jobs, by collecting STROBE statistics on DB2 queries, and improved the performance of long-running DB2 queries.
Environment: MVS, COBOL, CICS, PL/1, EASY TRIEVE, JCL, DB2, IMS, VSAM, ENDEVOR, FILE-AID, SPUFI, IBM OPTIM, QMF, SUPER C, SDSF, IDCAMS, VISIO, SERVICE MANAGER, FILENET, Control-Confidential, MQ Series, JAVA, WEB SERVICES
Confidential, Bloomington, IL
System Analyst
- Design, development, unit testing, supporting system testing, resolving production issues and post-production support of mainframe host batch/online processing for policy transactional data store applications using COBOL, JCL, CICS, DB2 and IMS.
- Primary work involved support, design and development of payments for the Payment Centers, online applications, Statefarm.com bill payments, NECHO, VRU, third-party payments and P&C Claims.
- Involved in the design, coding and testing of COBOL/EASYTRIEVE/JCL/IMS/DB2 batch and online programs.
- Analyzed the Internal Cash online and batch applications; prepared the high-level design documents and component specifications; developed and tested the changes; documented the test cases and their results; and conducted reviews, using COBOL, JCL, IMS, DB2 and MQ Series.
- Used the Service Management tool HP Service Manager to support Incident, Problem, Change, Configuration and service level management processes
- Developed front-end templates using HTML, DOM and JavaScript for Adoptive Assistance (AD) Function Transfer, Split or Merge.
- Participated in enhancements to the Payment Center ATS application screens/modules and online payment load applications using COBOL, DB2, IMS and MQ.
- Involved in coding and testing SAFR views and Fetchables.
- Presented design specs, test cases and test results in code review sessions with the entire team.
Environment: MVS, COBOL, PL/1, EASY TRIEVE, REXX, SAFR, JCL, TSO/ISPF, DB2, IMS, ENDEVOR, IBM OPTIM, FILE-AID, SPUFI, QMF, SUPER C, IDCAMS, VISIO, SERVICE MANAGER, FILENET, Control-Confidential, MQ Series
Confidential, Phoenix, AZ
Team Lead
- Preparing Technical specifications and Program specifications as per the business requirements.
- Preparing Unit Test plan documents, Coding and Unit Testing.
- Implementation of the changes.
- Loading data into DB2 tables and using SPUFI and QMF for testing.
- Technical guidance to the team.
- Modifying the existing COBOL/CICS/DB2 programs in both online and batch.
- Modifying and running the BATCH jobs for generating user specific reports.
- Helping the client in preparing System test data and system testing.
- Code walkthroughs were conducted to analyze the newly developed system before and after testing, evaluating it against software engineering quality metrics, namely effectiveness, efficiency, reliability and security.
- Data Cleaning and purging of the data from the recovery management system
- Participated in dress rehearsals for data validation after mock static and financial conversion.
- Support the User Acceptance Testing.
- Providing 24/7 Production support.
- Emergency maintenance - providing primary support to the application, which includes solving production abends.
Environment: COBOL, CICS, JCL, EASYTRIEVE, MQ Series, VSAM, DB2, QMF, SPUFI, SDSF, BMS, ABEND-AID, IDCAMS, INTERTEST, MS-EXCEL, MS-WORD, MVS, FILE-AID, EXPEDITOR, CHANGEMAN, TSO/ISPF, SAR, STORED PROCEDURES, PL/SQL, ORACLE.
Confidential
Mainframe Developer
- Participated in analyzing and understanding the business requirements
- Analyzed and documented the TELON-based application flows.
- Participated in preparing the program specifications for the TELON to COBOL/CICS conversion project, the Branch Expansion project and the LVC method changes project, including the business and technical requirements.
- Wrote technical specifications for both online and batch programs.
- Participated in understanding the existing TELON programs of the PEP systems.
- Participated in developing the PEP systems, both online and batch.
- Coded new programs in CICS/COBOL for the PEP online systems and enhancements to PEP.
- Modified the batch programs for the LVC changes and Branch Expansion changes; wrote BMP and DL/1 programs.
- Developed new CICS programs and screens using BMS macros for the online PEP system; generated new reports using Easytrieve and modified existing Easytrieve programs.
- Participated in unit testing of the new and enhanced programs and integration testing of all impacted applications.
- Wrote unit test plans, integration test plans
- Participated in code reviews of the programs and reviews of the test plans and test results.
- Participated in releasing the programs into staging environments using ENDEVOR
- Wrote new report programs and modified existing report programs using SAS and Easytrieve.
Environment: IBM-3090, OS/390, MVS/ESA, COBOL/370, JCL, CICS, IMS/DB, IMS/DC, VSAM, DB2, TELON, CA7, TSO/ISPF, SYNCSORT, FILE-AID, BMC, QMF, SPUFI, XPEDITER, CHANGEMAN, ABEND-AID, FOCUS, FTP, IDCAMS, UNIX, POWERBUILDER, MS Visio, MS Word, MS Excel, REXX, CLIST, MPP, DL/1, BMP.