Big Data Hadoop Developer Resume
Roseville, CA
SUMMARY:
- 12+ years of overall experience in enterprise application development across diverse industries, including hands-on experience with Big Data ecosystem technologies and work in Java and mainframe technologies on Linux and Windows platforms.
- Over two and a half years of experience in Hadoop development using HDFS, MapReduce, Pig, Hive, Sqoop, Flume, Kafka, Spark, Scala, YARN, HBase, Jetty and ZooKeeper, including designing, developing and deploying n-tiered, enterprise-level distributed applications.
- Experience writing MapReduce programs on Apache Hadoop to work with Big Data.
- Working experience in analytics and in designing and implementing complete end-to-end Hadoop infrastructure, including Pig, Hive, Sqoop, Oozie, Flume, Apache Spark, HBase, Kafka, Jenkins, ZooKeeper, Cloudera Manager, Hortonworks Distribution and Hue.
- Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
- Extensive experience working with Oracle, DB2, SQL Server and MySQL databases.
- Hands-on experience in application development using core Java, RDBMS and Linux shell scripting.
- Extending Hive and Pig core functionality by writing custom UDFs.
- Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java.
- Familiar with Java virtual machine (JVM) and multi-threaded processing.
- Worked on NoSQL databases including HBase and Cassandra.
- Knowledge of job workflow scheduling and monitoring tools such as Oozie and ZooKeeper.
- Experience in designing, developing and implementing connectivity products that allow efficient exchange of data between our core database engine and the Hadoop ecosystem.
- In depth knowledge of databases like DB2, Teradata, Oracle 8i/9i/10g/11g, MySQL and extensive experience in writing SQL queries, Stored Procedures, Triggers, Cursors, Functions and Packages.
- Techno-functional responsibilities include interfacing with users, identifying functional and technical gaps, estimates, designing custom solutions, development, leading developers, producing documentation, and production support.
- Excellent interpersonal and communication skills, creative, research-minded, technically competent and result-oriented with problem solving and leadership skills.
- Experience in managing and reviewing Hadoop log files.
- Very good experience in the complete project life cycle (design, development, testing and implementation) of client-server and web applications.
- Involved in writing MapReduce programs to parse logs stored in HDFS.
- Working experience in writing user defined functions (UDFs) in Hive and Pig scripting.
- Strong database knowledge and experience in tuning SQL queries for performance.
- Experience in requirement analysis, system design, development and testing of various software applications.
- Experience working with BI teams to translate big data requirements into Hadoop-centric technologies.
- Experienced and effective in visualizing and implementing creative solutions to the most complex requirements.
- Preparation of technical specifications for business needs, walkthroughs, coding and unit testing, integration testing and code promotions.
- Experienced in Agile development environments using the Scrum methodology; participated in grooming sessions, sprint planning and sprint retrospectives.
- Experienced application developer in multiple platforms and languages.
- Delivered several successful, large-scale, mission-critical projects on time using leading-edge technologies.
- Experience as a Senior Programmer in all phases of SDLC including Requirements gathering, Application Design, Development, Testing, Maintenance and Supporting.
- Quick learner and excellent team player with the ability to meet tight deadlines and work under pressure.
- Created program and system test data, test procedures and test scenarios to perform and/or coordinate the testing of new and existing features of the systems.
- Strong problem-solving, debugging, communication and time-management skills.
TECHNICAL SKILLS:
Hadoop/Big Data: Cloudera Distribution, HDFS, MapReduce, Pig, Hive, Impala, YARN, Kafka, Hadoop Cascading, HBase, Cassandra, Spark Streaming, Spark SQL, Scala, Sqoop, Flume, Oozie, Jenkins, ZooKeeper
IDEs: Eclipse, Microsoft Visual Studio, Micro Focus Net Express 5.0
Build Tools: Maven
Version Control: Git, Git-Flow
Web Server: Jetty
Programming languages: Core Java, Linux shell scripts, COBOL, PL/SQL
Databases: Oracle 11g/10g/9i, DB2, MS-SQL Server, DB2-UDB
Tools: Jira, AutoSys, Quality Center, Rational ClearQuest, Microsoft Visual Studio 2010, TFS, Change Management, Data File Tools, Micro Focus Net Express 5.1, Shell Scripting (Bash, Korn, C Shell), VI Editor, Putty, Animator, SQL Station, PVCS Dimensions, Toad, Cube-D, SR Tool, ALM, PPM, Business Intelligence (BO Tools), CALWIN, Hummingbird Exceed, TDMS, Case Copy, PTR, Revision Log, PanValet, Oracle SQL Developer, ClearCase Client (CCRC), Control-M, Beyond Compare, SED (stream editor), WinSCP, FileZilla, Developer 2000 (D2K), Oracle Forms, Form Builder, FTP, Novell NetWare, StarTeam, CA7, Ultimatrix, Peregrine, APLTS, SAR, SAP, SAS, TSO, ISPF, UniCenter, PIV, SDSF, Lotus Notes, Microsoft Office (MS Word, MS Excel, MS Visio, MS Outlook) etc.
Testing: ALM, QTP
Methodologies: Agile, waterfall
WORK EXPERIENCE:
Confidential, Roseville, CA
Big Data Hadoop Developer
Responsibilities:
- Gathered business requirements from business partners and subject matter experts.
- Responsible for managing data coming from different sources.
- Supported MapReduce programs running on the cluster.
- Real-time streaming of data using Spark with Kafka (see the Spark Streaming sketch after this list).
- Implemented Spark RDD transformations and actions to carry out business analysis.
- Converted all Hadoop Cascading flows to the enterprise data hub using Spark Streaming.
- Ingested data through an HTTP Jetty server and loaded it into Kafka topics.
- Participated in the development and implementation of the Cloudera Hadoop environment.
- Loaded and transformed large sets of structured, semi-structured and unstructured data.
- Gained good experience with NoSQL databases.
- Involved in loading data from UNIX file system to HDFS.
- Handled importing of data from various sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Teradata into HDFS using Sqoop.
- Analyzed the data by running Hive queries and Pig scripts to understand user behavior.
- Worked on Sqoop integration to import and export data between HDFS and Hive.
- Involved in HDFS maintenance and loading of structured and unstructured data
- Worked with the Spark ecosystem, using Spark SQL and Scala queries on data formats such as text and CSV files (see the Spark SQL sketch after this list).
- Implemented Spark using Scala and Spark SQL for faster testing and processing of data.
- Imported and exported data into HDFS using Sqoop and Kafka.
- Performed joins, group-by and other operations in MapReduce using Java or Pig Latin.
- Created indexes and tuned SQL queries in Hive.
- Developed custom UDFs in Java to extend Hive and Pig Latin functionality for querying and processing data (see the UDF sketch after this list).
- Developed a Python script to run the nightly batch process.
- Wrote MapReduce jobs using the Java API as well as Pig Latin and Hive (see the MapReduce sketch after this list).
- Involved in managing and reviewing Hadoop log files
- Wrote Hive queries for data analysis to meet business requirements.
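A minimal Java sketch of the Spark Streaming with Kafka pattern referenced above, using the spark-streaming-kafka-0-10 direct stream API; the broker address, topic name, consumer group and per-batch logic are hypothetical stand-ins, not the project's actual code.

```java
import java.util.*;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;

public class KafkaSparkStream {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("KafkaSparkStream");
        // Micro-batch interval of 10 seconds
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10));

        Map<String, Object> kafkaParams = new HashMap<>();
        kafkaParams.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        kafkaParams.put("key.deserializer", StringDeserializer.class);
        kafkaParams.put("value.deserializer", StringDeserializer.class);
        kafkaParams.put("group.id", "stream-demo");             // hypothetical consumer group
        kafkaParams.put("auto.offset.reset", "latest");

        // Subscribe to a hypothetical topic fed by the Jetty ingestion layer
        JavaInputDStream<ConsumerRecord<String, String>> stream =
            KafkaUtils.createDirectStream(
                jssc,
                LocationStrategies.PreferConsistent(),
                ConsumerStrategies.<String, String>Subscribe(
                    Collections.singletonList("events"), kafkaParams));

        // Count records per micro-batch as a stand-in for the real business logic
        stream.map(ConsumerRecord::value)
              .count()
              .print();

        jssc.start();
        jssc.awaitTermination();
    }
}
```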
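The Spark SQL work on CSV and text data could look roughly like the following Java sketch; the HDFS path, column names and query are illustrative assumptions.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class CsvSparkSql {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .appName("CsvSparkSql")
            .getOrCreate();

        // Read a CSV file with a header row, inferring column types
        Dataset<Row> users = spark.read()
            .option("header", "true")
            .option("inferSchema", "true")
            .csv("hdfs:///data/users.csv"); // hypothetical path

        // Register the data as a temporary view and query it with Spark SQL
        users.createOrReplaceTempView("users");
        Dataset<Row> byRegion = spark.sql(
            "SELECT region, COUNT(*) AS cnt FROM users GROUP BY region ORDER BY cnt DESC");

        byRegion.show();
        spark.stop();
    }
}
```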
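As a sketch of the custom Hive UDFs mentioned above, here is a minimal Java UDF skeleton using the classic Hive UDF API; the class name and its behavior (trimming and lower-casing a string) are assumed purely for demonstration.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical UDF: normalizes a string by trimming and lower-casing it.
// Registered in Hive with:
//   ADD JAR normalize-udf.jar;
//   CREATE TEMPORARY FUNCTION normalize AS 'NormalizeUDF';
public class NormalizeUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}
```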
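The MapReduce jobs written against the Java API could follow the standard pattern below; the log layout (space-separated fields with a status code in field 8) is an assumption for illustration.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class LogStatusCount {

    // Emits (statusCode, 1) for each well-formed log line
    public static class StatusMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text status = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(" ");
            if (fields.length > 8) {              // assumed: status code in field 8
                status.set(fields[8]);
                context.write(status, ONE);
            }
        }
    }

    // Sums the counts per status code
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "log status count");
        job.setJarByClass(LogStatusCount.class);
        job.setMapperClass(StatusMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```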
Environment: Hadoop, MapReduce, HDFS, Kafka, Pig, core Java, Cascading, Gerrit, Git-Flow, JSON, XML, XSLT, XQuery, XPath, Altova XMLSpy, Maven, Spark Core, Spark Streaming, Spark SQL, Cloudera Manager, Docker, Oozie, Jenkins, Jira etc.
Confidential, Roseville, CA
Information Associate
Responsibilities:
- Extensively involved in analysis, design, coding and testing.
- Interacted with business analysts to understand service requests and change requests.
- Involved in change request and design preparation meetings.
- Gathered and analyzed requirements and developed or modified programs to fulfill these needs; developed program logic and processing steps, coded programs, tracked and evaluated project and system progress, tested and evaluated alternative solutions, and recommended and implemented appropriate applications.
- Created JCL scripts to execute PL/SQL packages.
- Development using Micro Focus COBOL (MF-COBOL), Net Express 5.1, UNIX (AIX), Korn shell scripting, CICS, Oracle 11g, Putty, VI Editor, Animator, ClearQuest, Toad, PVCS Dimensions, Cube-D, Business Objects Desktop Intelligence, Hummingbird, TDMS, Case Copy, SQL Station, FileZilla and Microsoft Office (MS Word, MS Excel, MS Visio etc.).
- Performed maintenance of existing Micro Focus COBOL programs and Windows-based ACUCOBOL programs for the Management Reporting system.
- Developed various transformations like Source Qualifier, Sorter transformation, Joiner transformation, Update Strategy, Lookup transformation, Expressions and Sequence Generator for loading the data into target table.
- Involved in preparation of OLUM extracts and reports documentation.
- Wrote PL/SQL stored procedures using Toad.
- Developed new BO Reports and modification of existing reports as per the design using Business Objects XI.
- Conducted Mentoring sessions for new team members as per the Mentoring checklist.
- Performed 24x7 on-call production support, providing rapid solutions to client problems.
Environment: Micro Focus COBOL (MF-COBOL), UNIX (AIX), Korn shell scripting, Oracle 11g, CICS, Putty, VI Editor, Animator, ClearQuest, Toad, PL/SQL, PVCS Dimensions etc.
Confidential, Sacramento, CA
Senior Developer
Responsibilities:
- Mapped source fields to target table columns.
- Met with business experts to complete source-to-target data mapping.
- Coded and tested mainframe programs to extract data from source systems.
- Converted the DB2 database to Oracle.
- Created SQL*Loader scripts to load files into existing tables, or designed, created and loaded new Oracle tables.
- Converted VS COBOL programs into Micro Focus COBOL, COBOL-DB2 programs into embedded SQL (Oracle 10g) programs, and online (COBOL-DB2-CICS) programs into batch (COBOL-ESQL) programs.
- Converted all CICS programs into batch (ESQL) programs.
- Converted EBCDIC data (VSAM and flat files) into ASCII format and VSAM files into ISAM files (see the sketch after this list).
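The EBCDIC-to-ASCII conversion in the last item above can be sketched in Java using the JVM's built-in Cp037 EBCDIC charset; the file names are hypothetical, and a real conversion would also have to handle packed-decimal (COMP-3) fields, which cannot be translated as text.

```java
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class EbcdicToAscii {
    public static void main(String[] args) throws IOException {
        // Cp037 is the JVM's US/Canada EBCDIC code page
        Charset ebcdic = Charset.forName("Cp037");

        // Hypothetical input: a flat file of EBCDIC text records
        byte[] raw = Files.readAllBytes(Paths.get("input.ebcdic"));

        // Decode the EBCDIC bytes to a Java string, then re-encode as ASCII
        String text = new String(raw, ebcdic);
        Files.write(Paths.get("output.ascii"), text.getBytes(StandardCharsets.US_ASCII));
    }
}
```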
Environment: Micro Focus COBOL (MF-COBOL), MVS/JCL, VSAM, ISAM, ClearCase Remote Client (CCRC), Oracle 10g, SQL Developer, SQL Server 2005, VS COBOL, CICS, FTP, SyncSort, PANVALET, Microsoft Visual Studio 2010.
Confidential, Englewood, CO
Senior Developer
Responsibilities:
- Respond to Level 1 escalations and ensure timely resolution and escalation to L3 as needed.
- Supported Business Continuity/DR activities.
- Overall support for EDI (Mapping, Monitoring, and administration).
- Defined and maintained CICS online tables (administration including the File Control Table, Program Control Table, Transaction Control Table, etc.).
- Administered the ASG ViewDirect repository, including access, destination printers, delivery, report definitions, report retention, etc.
- Supported deployment of changes, including software upgrades, to the Western Union environment.
- Performed initial installation and/or configuration of third-party software in the production environment per installation instructions provided by the Rehost team.
- Handled password reset requests.
- Troubleshot Micro Focus Enterprise Server administration issues, CA Workload Automation issues, ASG Document Direct issues, Infogix Assure issues and WDI issues.
- Prepared and processed LDIF files to create user accounts in Micro Focus (Active Directory) and to provide access to Micro Focus groups.
- Provided user access to CA Workload Automation, ASG ViewDirect, Infogix, Connect Direct, etc.
- Installed SSL certificates in the Micro Focus environment (TN3270, TOR/AOR, ESMAC and ESA encryption).
- Troubleshot issues when recycling Micro Focus server regions.
Environment: Micro Focus Enterprise Server administration, CA Workload Automation, ASG ViewDirect, ASG Document Direct, Infogix, Connect Direct, VSS, SSL encryption, SQL, DB2 UDB, VS and Microsoft Office (MS Word, MS Excel, MS Visio).
Confidential, Newark, DE
Senior Developer
Responsibilities:
- Gathered business requirements and worked with business analysts to determine where changes were needed in the system; these changes were made to meet the federal Health Insurance Portability and Accountability Act (HIPAA) conversion and compliance requirements.
- Involved in Analysis, Design, Coding and testing extensively.
- Performed coding, testing and debugging for MF/COBOL programs.
- This position involved testing and development of new complex COBOL programs and data.
- Interacted with business users to gather business requirements that meet customer demands.
- Daily interaction with the Business Process Owners across the company.
- Involved in preparation of business design, technical design and walkthroughs.
- Generated reports for both the Claims and Financial subsystems; report generation included writing SQL queries per the design specifications, developing Micro Focus COBOL batch code and writing UNIX shell scripts.
- Designed, constructed, supported and maintained software using Micro Focus COBOL, UNIX (Sun Solaris), Micro Focus Net Express (IDE), SQL, DB2, VI Editor, Putty, Win VI, WinSQL, WinSCP (FTP), MMIS, VCTL etc.
- Created the .dat files as Excel files, merged multiple Excel files into one workbook using VBScript, and placed them on a network drive.
- Construction of the programs in accordance with the Technical System Design.
- Performed code reviews and code walkthroughs for other members of the team.
- Program analysis for code changes and testing of coded software components.
- Reviewed user interfaces developed by various developers and provided technical support for reports.
- Worked with the QA team to get approval of the developed application.
- Reported weekly project execution status to the client manager.
- Resolved any issues with execution.
Environment: Micro Focus COBOL, JCL, UNIX (Sun Solaris), Micro Focus Net Express (IDE), Enterprise Server, Korn shell, SED, SQL, DB2, Putty, VI Editor, WinSQL, MMIS, Win VI, and Microsoft Office (MS Word, MS Excel, MS Visio and MS Outlook).
Confidential, Franklin, MA
Senior Developer
Responsibilities:
- Actively involved in analysis, coding, testing, debugging, maintenance of new and existing programs, troubleshooting, creating job procedures, turnover to production, and post-implementation support.
- Interacted with business users to gather business requirements that meet customer demands.
- Provided technical support for reports loading/accessing/modifying data in DB2 UDB tables through Micro Focus COBOL programs.
- Designed, developed, supported and maintained software using Micro Focus COBOL, UNIX, SQL, DB2 UDB, JCL, FTP, Micro Focus Net Express (IDE), Server Express, Enterprise Server etc.
- Worked with the QA team to get approval of the developed application.
- Provided necessary reports for verification.
- Reported weekly project execution status to the client manager.
- Provided technical help to the offshore team in resolving problems in mainframe environment applications.
- Excellent working knowledge of huge VSAM data files, their reorganization processes and efficient access methods.
- Provided technical support for reports loading/accessing/modifying data in DB2 tables through mainframe programs.
- Designed, developed and maintained restitution applications using Micro Focus COBOL.
- Responsible for maintaining/cleaning databases (ISAM data files/DB2 tables) on a request basis.
- Responsible for analyzing business processes by collecting metrics, documenting findings and providing possible solutions to improve the processes.
- All work was done in a very fast-paced, results-oriented environment.
- Responsible for interacting with end users to guide them and resolve issues with the business applications.
Environment: Micro Focus COBOL, UNIX (Red Hat Linux), Korn shell scripting, JCL, VSAM, DB2 UDB, CICS, Putty, VI Editor, SQL, MFEEE - Micro Focus Server Express, Enterprise Server, Net Express 5.1, Animator, IBM DB2, Lotus Notes, FileZilla, FTP and Microsoft Office (MS Word, MS Excel, MS Access, MS Visio).
Confidential
Software Engineer
Responsibilities:
- Program Analysis for code changes.
- Preparation of requirement specifications and the technical specification document (TSD).
- Construction of the program in accordance with the TSD.
- Testing of coded software component.
- Document the application details.
- Drafted test scenarios and test scripts to obtain user acceptance of the software solutions provided by the onsite/offshore team.
- Automated heavily person-dependent processes, such as the purge process for different products, for reuse.
- Preparation of Control plans for all applications with process and system information.
- Merging applications to reduce base maintenance costs.
- Report generation for various application and system users
Environment: VS COBOL II, MVS/JCL, IBM utilities, DB2, CICS, SQL, Endevor, Peregrine, Boulevard, APLTS, SAR, Expeditor, Ultimatrix, FTP tool and Microsoft Office (MS Word, MS Excel etc.).