Hadoop Developer, System Analyst Resume
Bloomington, IL
SUMMARY
- Over 8.7 years of IT experience, with strong experience in application development, data analytics, and the Hadoop platform across industries including Financial Markets, Insurance, and Credit Rating.
- Expertise in core Hadoop and the Hadoop technology stack, including HDFS, MapReduce programming, Sqoop, Hive, HBase, Oozie, Pig, Hue, and HCatalog.
- Experience in project delivery, system analysis and design, quality processes, and software development.
- Very good data analysis and data validation skills.
- Strong understanding of and hands-on experience with managing clusters and building big data applications on Hadoop clusters secured with Kerberos.
- Good knowledge of life insurance within the Insurance domain, and of Financial Markets.
- Extensive experience with analysis, design, development, customizations and implementation of software applications.
- Proficient in analyzing and translating business requirements to technical requirements and architecture.
- Good experience leading technical teams, with excellent problem-solving, communication, and coordination skills.
- Excellent communication and interpersonal skills; self-motivated, quick learner, team player.
- Able to work well both in a team and individually, and to adapt to new tools and applications.
- Cloudera Certified Hadoop Administrator.
- LOMA 280 certified (insurance business domain).
TECHNICAL SKILLS
Hadoop Expertise: Hive, Pig, Core Java, MapReduce, HDFS, Sqoop, Hue, HCatalog, Flume, and HBase
Languages: Java, COBOL, JCL, Base SAS & Macros, SQL, Mainframe Assembler
Databases: DB2, PostgreSQL, MySQL
Operating Systems: z/OS, MVS
Specialized Tools: Tableau, IBM Data Studio, SPUFI, Abend-AID, File-AID, Xpediter, Command Editor, pgAdmin III, Panvalet, Librarian, ISPF, QMF, Xpediter Code Coverage, DFSORT, Easytrieve, Strobe, Timekeeper, IBM DB2 command utility
Processes: IBM's QMS (Quality Management System) concepts, CMM & PCMM awareness, Metrics & Measurements, Lean Six Sigma.
PROFESSIONAL EXPERIENCE
Confidential, Bloomington, IL
Hadoop Developer, System Analyst
Responsibilities:
- Understood business needs, analyzed functional specifications, and mapped them to the Mule services and web services of the existing applications that insert, update, and retrieve data from HBase.
- Installed and managed a 4-node, 4.8 TB Hadoop cluster for the SOW, and eventually configured a 12-node, 36 TB cluster for the production and implementation environments.
- Implemented Hive tables and HQL queries for the reports.
- Wrote SerDes and worked with JSON data in Hive; developed Hive queries to analyze reducer output data.
- Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables (a minimal mapper sketch follows this list).
- Strong understanding of and hands-on experience with setting up Kerberos security on Hadoop clusters.
- Troubleshot issues and errors reported by the cluster monitoring software provided by Cloudera Manager.
- Created simple rule-based optimizations, such as pruning non-referenced columns from table scans.
- Set task statuses to display debug information and show the status of MapReduce jobs on the JobTracker web page.
- Used Oozie to automate data loading into HDFS, and Pig to pre-process the data on a daily catch-up run basis.
- Configured the cluster to periodically archive log files for debugging and to reduce the processing load on the cluster, and tuned the cluster for better performance.
- Involved in extracting and loading data from RDBMS sources into Hive using Sqoop.
- Tested raw data, executed performance scripts, and shared responsibility for administration of Hadoop, Hive, and Pig.
- Involved in the design of the unstructured JSON data format and built the required SerDes for the web service.
- Created Puppet scripts for software updates and code deployments.
- Increased the performance of the Hadoop cluster by using hashing and salting methodologies for load balancing (see the salting sketch after this list).
- Optimized the HBase data retrieval calls to stay native to a region and improved range-based scans.
- Responsible for the quality and timeliness of the deliverables and for facilitating the quality review process.
- Maintained financial information by making secured calls to the vendors for sensitive data.
- Highly involved in designing the next-generation data architecture for the unstructured data.
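The bullet on parsing raw data into staging tables can be illustrated with a short mapper. This is a minimal sketch, not the original project code: the pipe-delimited layout, field count, and the class name RawRecordMapper are assumptions made for illustration.

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Illustrative mapper: parses raw delimited records and emits cleaned
// rows in the layout of a Hive staging table.
public class RawRecordMapper extends Mapper<LongWritable, Text, NullWritable, Text> {

    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        String[] fields = line.toString().split("\\|", -1); // assumed pipe-delimited input
        if (fields.length < 3) {
            context.getCounter("parse", "malformed").increment(1); // count bad records
            return;
        }
        // Emit trimmed fields as tab-separated text for the staging table.
        context.write(NullWritable.get(),
                new Text(fields[0].trim() + "\t" + fields[1].trim() + "\t" + fields[2].trim()));
    }
}
```

From the staging table, a Hive INSERT ... SELECT with dynamic partitioning would then move the refined rows into the partitioned tables.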
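The salting bullet above can be sketched as a small key helper. This assumes a fixed bucket count and the HBase client's Bytes utility; SALT_BUCKETS and makeSaltedKey are illustrative names, not details from the original project.

```java
import org.apache.hadoop.hbase.util.Bytes;

// Illustrative helper: salts HBase row keys so sequential keys spread
// across regions instead of hot-spotting a single region server.
public final class SaltedKeys {

    // Assumed bucket count; in practice it matches the number of pre-split regions.
    private static final int SALT_BUCKETS = 12;

    // Prefix the natural key with a one-byte salt derived from its hash,
    // so the same key always lands in the same bucket.
    public static byte[] makeSaltedKey(String naturalKey) {
        byte salt = (byte) Math.abs(naturalKey.hashCode() % SALT_BUCKETS);
        byte[] keyBytes = Bytes.toBytes(naturalKey);
        byte[] salted = new byte[1 + keyBytes.length];
        salted[0] = salt;
        System.arraycopy(keyBytes, 0, salted, 1, keyBytes.length);
        return salted;
    }
}
```

Range scans over salted keys then fan out one scan per bucket, which is the trade-off accepted for the write-side load balancing.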
Environment: DB2, PostgreSQL, MySQL, Expression, HBase, HDFS, Hadoop MapReduce, ZooKeeper, Hive, Pig, Sqoop, Oozie, Cloudera CDH4, Hue, Flume, Impala, Micro Focus Rumba, DataFlux, SPUFI, pgAdmin III, and IBM Data Studio.
Confidential, Bloomington, IL.
Hadoop Developer, System Analyst - Web
Responsibilities:
- Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, HBase, and Sqoop.
- Responsible for building scalable distributed data solutions using Hadoop.
- Involved in loading data from Linux/UNIX file systems to HDFS.
- Worked on installing the cluster, commissioning and decommissioning DataNodes, NameNode recovery, capacity planning, and slot configuration.
- Created HBase tables to store variable data coming from different portfolios.
- Implemented a script to transfer information from DB2 to HBase using Sqoop.
- Worked on tuning the performance of Pig queries.
- Exported the analyzed data to relational databases using Sqoop, for visualization and to generate reports for the BI team.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Provided cluster coordination services through ZooKeeper.
- Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, and managing and reviewing data backups and Hadoop log files.
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
- Supported in setting up QA environment and updating configurations for implementing scripts with Pig and Sqoop.
- Optimized MapReduce jobs to use HDFS efficiently by using various compression mechanisms (a minimal setup sketch follows this list).
- Analyzed customer behavior by performing clickstream analysis, using Flume to ingest the data.
- Implemented business logic by writing UDFs in Java and used various UDFs from Piggybank and other sources (see the UDF sketch after this list).
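The compression bullet can be illustrated with a short driver fragment. This is a minimal sketch under assumptions: the MRv2 property names, Snappy as the codec, and the job name are illustrative choices, not details from the original project.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.SnappyCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;

public class CompressedJobDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Compress intermediate (map-side) output to cut shuffle I/O.
        conf.setBoolean("mapreduce.map.output.compress", true);
        conf.setClass("mapreduce.map.output.compress.codec",
                SnappyCodec.class, CompressionCodec.class);

        Job job = Job.getInstance(conf, "compressed-aggregation"); // illustrative name
        // Block-compress the final output stored in HDFS.
        job.setOutputFormatClass(SequenceFileOutputFormat.class);
        FileOutputFormat.setCompressOutput(job, true);
        FileOutputFormat.setOutputCompressorClass(job, SnappyCodec.class);
        SequenceFileOutputFormat.setOutputCompressionType(job, SequenceFile.CompressionType.BLOCK);
        // Mapper, reducer, and input/output paths would be configured here as usual.
    }
}
```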
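A Java UDF like the ones mentioned above might look like the following. The function name NormalizeCode and its behavior are hypothetical, chosen only to show the Pig EvalFunc pattern.

```java
import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

// Illustrative Pig UDF: trims and upper-cases a code field, null-safe.
public class NormalizeCode extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null; // pass missing values through unchanged
        }
        return input.get(0).toString().trim().toUpperCase();
    }
}
```

In a Pig script, the jar would be registered with REGISTER and the function invoked inside a FOREACH ... GENERATE, the same way Piggybank UDFs are used.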
Environment: DB2, PostgreSQL, MySQL, Expression, HBase, HDFS, Hadoop MapReduce, ZooKeeper, Hive, Pig, Sqoop, Oozie, Cloudera CDH4, Hue, Flume, Impala, Micro Focus Rumba, DataFlux, SPUFI, pgAdmin III, and IBM Data Studio.
Confidential, Bloomington, IL.
Mainframe Dev Lead, System Analyst - Web
Responsibilities:
- Involved in defining the business requirement specifications and functional specifications by directly interacting with the end users, and proposed corresponding technical specifications for the build.
- Established reusable components and implemented best practices and standards.
- Prepared the testing strategy for the converted modules and ensured the existing functionality worked as expected, along with the mapping.
- Worked on implementation activities such as packlist updates, proc changes, and the ordered movement of submodules and copybooks to production before moving the actual code to production.
- Identified and tracked defects found in testing, analyzed their root cause, and resolved them by understanding the current functionality, which was written in Assembler.
- Developed specific jobs using JCL to run the batch related to the conversion effort.
- Provided support for implementation and was responsible for resolving post production defects.
- Created job aids for all the converted modules, along with component specifications.
- Involved in enhancements and maintenance activities of the conversion effort including performance tuning, writing of stored procedures for code enhancements, creating tables, and modifying target codes.
Environment: DB2, PostgreSQL, MySQL. Software and Tools: Assembler, COBOL, JCL, VSAM, CICS, Data Studio, Code Coverage Analyzer, Data Migration tool, File-AID, Xpediter, CA-7, and Panvalet.
Confidential, Bloomington, IL.
Mainframe Dev Lead, System Analyst - Web
Responsibilities:
- Gathered and evaluated business requirements, created component and design specifications, implemented the requirements, and performed unit testing.
- Estimated effort for all project phases, evaluated the business requirements, and created detailed designs and component specifications for the impacted components.
- Prepared the data conversion strategy for the application that needed to be migrated to COBOL.
- Prepared the testing strategy for the converted modules, ensuring the existing functionality worked as expected.
- Identified and tracked defects in testing, analyzed their root cause, and resolved them by understanding the current functionality, which was written in Assembler.
- Responsible for the overall quality and timeliness of all deliverables; prepared the maintenance strategy for the application migrated to COBOL.
- Tracked and reported the current status of the project, anticipating issues and updating the client early to enable better solutions.
- Identified pain areas to avoid bottlenecks in the project, and established and monitored the processes for change management and quality management.
Environment: DB2, PostgreSQL, MySQL, Assembler, COBOL, JCL, VSAM, CICS, Data Studio, Abend-AID, Code Coverage Analyzer, Data Migration tool, File-AID, Xpediter, InterTest, CA-7, ChangeMan, ESP, and Panvalet.
Confidential
Mainframe Dev Lead, System Analyst - Web
Responsibilities:
- Provided mainframe data center support for the daily batch cycle process; gathered and evaluated business requirements and created component and design specifications.
- Handled service call requests to force-start UNIX jobs, mark jobs complete, put jobs on hold, and terminate jobs, and monitored AS/400 backups.
- Monitored production, corporate, and development CA-7 for mainframe job abends; enhanced mainframe modules and performed temporary batch job changes per customer requests.
- Handled requests for mainframe jobs through ESP, including rerunning failed jobs and analyzing and fixing mainframe job abends during the batch run.
- Evaluated the business requirements, created design specifications, and managed the conversion activities.
- Responsible for the overall quality and timeliness of the deliverables and for onsite-offshore coordination.
- Produced high-level designs, low-level designs, and sequence diagrams for the required feature implementations.
Environment: DB2, PostgreSQL, MySQL, COBOL, JCL, CA-7, ESP Workstation, AutoSys, HP OpenView, AS/400.
Confidential
Mainframe Developer
Responsibilities:
- Prepared business requirements, created component and design specifications, and implemented the requirements.
- Interacted with Business Analyst to understand the business requirements.
- Monitored jobs running in the CRIS/ICRIS module, provided offshore support during the batch cycle, and provided temporary fixes for batch abends.
- Worked on coding and testing programs and JCL when needed in production.
- Prepared business cases involving new one-time programs and changes to existing programs.
- Worked on production defects related to batch issues and provided fixes for them.
- Analyzed user queries and created ChangeMan packages for installation in production.
- Worked on technical initiatives and developed high-level designs, low-level designs, and sequence diagrams for the required feature implementations.
- Involved in production support and trained the other developers to handle issues.
- Involved in project planning and coordinating business, source and development teams to meet the project deadlines.
Environment: COBOL, JCL, VSAM, CICS, DB2, ESP, Abend-AID, File-AID, Xpediter, InterTest, and ChangeMan.
Confidential
Mainframe Developer
Responsibilities:
- Monitored jobs running in the RDS system with the help of Control-M.
- Provided offshore support for the RDS system during the batch cycle, and provided temporary fixes for batch issues.
- Raised batch processing requests to resolve batch abends.
- After execution of the batch cycle, analyzed critical batch jobs, ran backups, and provided permanent fixes for batch abends.
- Worked on coding and testing programs and JCL when needed in production.
- Worked on online abends and enhancements, and provided resolutions for severity 2 and 3 tickets.
Environment: COBOL, JCL, VSAM, CICS, DB2, Datacom, Roscoe, Endevor, InterTest, File-AID, Xpediter, Abend-AID, QMF.