- A competent programmer with deep knowledge and skills in computer programming, eager to take on challenges and work in a competitive environment.
- Strong IT professional with 13+ years of programming experience, including several years in Big Data and Big Data analytics.
- Experienced in installing and configuring Hadoop clusters on the major Hadoop distributions.
- Hands-on experience in writing MapReduce jobs.
- Hands-on experience working with ecosystem tools such as Hive, Pig, Sqoop, MapReduce, and Python.
- Strong knowledge of Hadoop, Hive, and Hive's analytical functions.
- Hands-on experience installing, configuring, and using the ecosystem components Hadoop MapReduce, HDFS, HBase, ZooKeeper, Oozie, Hive, Cassandra, Sqoop, Pig, and Flume.
- Successfully loaded files into Hive and HDFS from MySQL.
- Loaded datasets into Hive for ETL operations.
- Good knowledge of Hadoop cluster architecture and cluster monitoring.
- Extensive knowledge and experience in COBOL, CICS, JCL, SQL, DB2 v10, VSAM, IMS DB, FILE-AID, TSO/ISPF, CA7 (scheduler), ChangeMan, Endevor, REXX, Expediter, INFOMAN, SPUFI, IBM utilities, and DB2 stored procedures.
- Certified PSM I Scrum Master.
- Extensive knowledge of Unemployment Insurance tax systems and legacy system modernization.
- Excellent project management skills, including estimation, planning, execution, control, and delivery; responsible for all deliverables and provided cost-effective solutions with help from Project Managers.
- Results-oriented, strategic thinker, highly analytical, with a record of increasing application performance and reducing costs; excellent with both technical and business matters.
- Excellent knowledge of the IT Infrastructure Library; ITIL V3 certified professional.
- Knowledge of Service-Oriented Architecture (SOA) and web services; transferred files and reports to open systems.
- Managed production, enhancement, and support work for RBS application management projects, applying Incident Management and Problem Management skills to all customer-facing applications.
- Project communications management experience includes determining the information and communication needs of project stakeholders, and collecting and distributing performance information, including status reporting, progress measurement, and forecasting.
- Extensive experience in the design, development, testing, production support, and implementation of IBM mainframe applications; development life cycle experience including analysis, design, coding, testing, and quality control of various software projects.
- Proven strength in problem solving, coordination, and analysis. Strong communication, interpersonal, learning, and organizing skills, matched with the ability to manage stress, time, and people effectively.
- Played the role of Team Leader, handling activities such as work planning, allocation, tracking, reviewing, and testing.
- Hands-on experience in Hadoop, the HDFS file system, Pig, Hive, HBase, ZooKeeper, and Sqoop.
Operating Systems: z/OS, OS/390, Windows, UNIX
Languages: COBOL 390, COBOL II, JCL, JES2, C, SQL, Python
Hadoop Ecosystem: Hadoop MapReduce, HDFS, Flume, Sqoop, Hive, Pig, Oozie, Cloudera Manager, ZooKeeper
Databases: DB2 v9/v10, IMS DB
Tools and Utilities: XPEDITER, SPUFI, FILE-AID, FIXIT, Endevor, MS Visio, CICS
Quality Systems, Maryland - Hadoop Programmer
- Created, validated, and maintained scripts to load data manually using Sqoop.
- Created Oozie workflows and coordinators to automate Sqoop jobs on weekly and monthly schedules.
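For illustration, a Sqoop load of the kind scripted above can be sketched as a command assembled in Python; the connection string, table name, and HDFS paths are hypothetical, not taken from the projects described here.

```python
# Sketch of a script that assembles a Sqoop import command for a scheduled load.
# The JDBC URL, credentials file, table, and target directory are hypothetical.

def build_sqoop_import(table, target_dir, mappers=4):
    """Return the Sqoop CLI arguments to import one MySQL table into HDFS."""
    return [
        "sqoop", "import",
        "--connect", "jdbc:mysql://dbhost:3306/claims",  # hypothetical source DB
        "--username", "etl_user",
        "--password-file", "/user/etl/.sqoop_pass",
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(mappers),
        "--fields-terminated-by", "\t",
    ]

cmd = build_sqoop_import("employers", "/data/raw/employers")
# In a real job this list would be handed to subprocess.run(cmd, check=True),
# or the same arguments saved as a `sqoop job` invoked by an Oozie action.
print(" ".join(cmd))
```

Keeping the argument list in one function makes it easy for an Oozie coordinator to call the same import weekly or monthly with only the target directory changing.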
- Worked on reading multiple data formats on HDFS using Scala.
- Ran reports in Pig and Hive.
- Developed, validated, and maintained HiveQL queries.
- Fetched data to and from HBase using MapReduce jobs.
- Wrote MapReduce jobs.
- Analyzed data with Hive and Pig.
- Designed Hive tables to load data to and from external tables.
- Ran executive reports using Hive and QlikView.
- Collected and aggregated large amounts of streaming data into HDFS using Flume, defining channel selectors to multiplex data into different sinks.
- Implemented complex MapReduce programs to perform map-side joins using the distributed cache.
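As a sketch of the map-side join technique just mentioned, here is the core logic in Python, Hadoop Streaming style; the field layout and sample data are hypothetical, and on a real cluster the lookup file would be shipped to each mapper via the distributed cache (the `-files` option) rather than read locally.

```python
# Map-side join sketch: a small dimension table is loaded into memory by every
# mapper (on a cluster it arrives via the distributed cache), and each input
# record is joined against it in the map phase, with no reduce step needed.

def load_lookup(lines):
    """Parse 'id<TAB>name' lines from the cached side file into a dict."""
    lookup = {}
    for line in lines:
        key, name = line.rstrip("\n").split("\t")
        lookup[key] = name
    return lookup

def map_side_join(records, lookup):
    """Emit 'id<TAB>name<TAB>amount' for records whose key is in the cache."""
    out = []
    for rec in records:
        key, amount = rec.rstrip("\n").split("\t")
        if key in lookup:  # inner join: unmatched records are dropped
            out.append(f"{key}\t{lookup[key]}\t{amount}")
    return out

# Local stand-ins for the cached side file and the mapper's input split.
lookup = load_lookup(["1\tAcme\n", "2\tGlobex\n"])
joined = map_side_join(["1\t250\n", "3\t99\n", "2\t75\n"], lookup)
```

The design choice is the usual one: when one side of the join fits in memory, doing the join in the mappers avoids the shuffle entirely.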
- Designed and implemented custom writables, custom input formats, custom partitioners, and custom comparators in MapReduce.
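The custom-partitioner idea can be illustrated in Python; Hadoop's `Partitioner` is a Java interface, so this is only the routing logic, and the composite key format shown is a hypothetical example.

```python
# Custom partitioner sketch: route records by the natural key only, so all
# values for one account land on the same reducer even when the full map
# output key is a composite 'account|timestamp' used for secondary sorting.

NUM_REDUCERS = 4

def partition(composite_key, num_reducers=NUM_REDUCERS):
    """Hash only the first part of 'account|timestamp' to pick a reducer."""
    natural_key = composite_key.split("|")[0]
    # Mimic Java's non-negative (hashCode() & Integer.MAX_VALUE) % numPartitions.
    return (hash(natural_key) & 0x7FFFFFFF) % num_reducers

p1 = partition("acct42|2015-01-01")
p2 = partition("acct42|2015-06-30")
```

Paired with a custom comparator that sorts within each account by timestamp, this is the standard secondary-sort pattern.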
- Responsible for troubleshooting issues in the execution of MapReduce jobs by inspecting and reviewing log files.
- Converted existing SQL queries into HiveQL queries.
- Used Oozie effectively to develop automated workflows of Sqoop, MapReduce, and Hive jobs.
- Exported the analyzed data into relational databases using Sqoop for visualization and to generate reports for the BI team.
- Developed Hive (version 0.10) scripts to meet end-user/analyst requirements for ad hoc analysis.
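An ad hoc HiveQL script of the sort described can be parameterized from Python; the table and column names below are made up for illustration and the rendered string would be fed to the Hive CLI (`hive -e "..."`) or saved as an `.hql` file.

```python
# Sketch: render a parameterized HiveQL query for an ad hoc analyst request.
# Table and column names (claims_ext, load_dt, amount) are hypothetical.

HQL_TEMPLATE = """
SELECT state, COUNT(*) AS claim_cnt, SUM(amount) AS total_amt
FROM {table}
WHERE load_dt BETWEEN '{start}' AND '{end}'
GROUP BY state
ORDER BY total_amt DESC
LIMIT {top_n};
""".strip()

def render_report(table, start, end, top_n=10):
    """Fill in the report template for one table and date range."""
    return HQL_TEMPLATE.format(table=table, start=start, end=end, top_n=top_n)

hql = render_report("claims_ext", "2015-01-01", "2015-03-31")
```

Templating the date range and row limit is what makes the same script reusable across repeated ad hoc requests.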
- Worked with application teams to set up new Hadoop users, and set up and tested HDFS, Hive, Pig, and MapReduce access for the new users.
- Involved in developing Pig scripts; solved performance issues in Hive and Pig scripts through an understanding of joins, grouping, and aggregation and how they translate to MapReduce jobs.
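The way a Pig/Hive GROUP-and-aggregate translates to MapReduce can be sketched in Python: the map phase emits (key, value) pairs, the shuffle groups them by key, and the reduce phase sums each group. The field names and sample records are hypothetical.

```python
# Sketch of how "GROUP BY state, SUM(amount)" becomes MapReduce: map emits
# (state, amount), the shuffle sorts/groups pairs by key, and the reduce sums
# each group. sorted() + groupby stands in for the shuffle/sort phase.

from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    """Emit (state, amount) pairs from 'state<TAB>amount' records."""
    for line in lines:
        state, amount = line.rstrip("\n").split("\t")
        yield state, int(amount)

def reduce_phase(pairs):
    """Sum amounts per state; input is sorted by key, as after a shuffle."""
    return {key: sum(amount for _, amount in group)
            for key, group in groupby(sorted(pairs, key=itemgetter(0)),
                                      key=itemgetter(0))}

totals = reduce_phase(map_phase(["MD\t100\n", "VA\t40\n", "MD\t60\n"]))
```

Seeing the aggregation this way makes the usual tuning levers obvious: a combiner cuts shuffle volume, and skewed keys overload a single reducer.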
- Developed Sqoop scripts to enable interaction between Pig and the MySQL database.
- Involved in resolving Hadoop-related JIRA issues.
- Extracted raw HTML data from the code repository using Pig regular expressions and loaded it into a Hive external table using a dynamic path ( Extern').
- Moved all crawl-data flat files generated by various retailers to HDFS for further processing.
- Using Sqoop, exported processed Hive external-table output into MySQL and imported data from the legacy system into HDFS.
- Participated in the recovery of production incidents and their permanent resolution through problem management; used the HP Service Manager tool for incident/problem tickets in the live production environment and handled multiple recoveries of the banking system.
- Reviewed, analyzed, and modified programming systems, including encoding, debugging, testing, installing, and documenting programs.
- Collaborated with Project Managers, business partners, and developers to coordinate project schedules and timelines.
- Worked with business partners, analysts, developers, and Project Managers to develop test plans and produce test scenarios and repeatable test cases/scripts through all parts of the development life cycle; executed and signed off against a high volume of data and a regular release schedule for successful project delivery.
- Worked as a cost-effective solution architect, determining modifications and impacts to the system infrastructure/framework and solutions to the business requirements.
- Worked with business users and analysts to determine business and technical requirements, translated them into high- and low-level designs, and worked with Project Managers on cost estimation.
- Produced the technical design (HLD and LLD) of the mainframe components of the project (especially COBOL modules, JCL, DB2, and CICS).
- Responsible for creating the business process flows and dataflow diagrams
- Analyzed requirements provided by clients and the current system state to propose innovations for continuous system improvement.
- Transitioned and supported infrastructure in “Business As Usual” mode, ensuring high levels of customer satisfaction.
- Handled incident and problem management; supported the development team in code reviews and defect tracking. Implemented SDLC phases for all work efforts and documented phase-associated deliverables.
- Performed system and integration testing plus maintenance and support for the project; incident and problem management (tool used: HP Service Manager).
- Coordinated production monitoring and fine-tuning of application programs with the help of the DBA.
- Remodeled legislative processes to fine-tune performance and accurately report financial data