Hadoop Developer Resume

Roseville, CA

SUMMARY

  • 10+ years of overall experience in enterprise application development across diverse industries, including hands-on experience with Big Data ecosystem technologies.
  • 2 years of hands-on experience with Hadoop, HDFS, and the MapReduce framework, and with Hadoop ecosystem tools such as Hive, HBase, Sqoop, and Oozie.
  • Experience developing MapReduce programs on Apache Hadoop to process Big Data.
  • Experience in using Pig, Hive, Impala, Sqoop, HBase and Cloudera Manager.
  • Experience in importing and exporting data with Sqoop between HDFS and relational database systems (see the sketch after this list).
  • Extensive experience working with Oracle, DB2, SQL Server, and MySQL databases.
  • Hands-on experience in application development using core Java, RDBMS, and Linux shell scripting.
  • Extending Hive and Pig core functionality by writing custom UDFs.
  • Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java.
  • Familiar with Java virtual machine (JVM) and multi-threaded processing.
  • Worked on NoSQL databases including HBase and Cassandra.
  • Knowledge of job workflow scheduling and monitoring tools such as Oozie and ZooKeeper.
  • Experience in designing, developing, and implementing connectivity products that allow efficient exchange of data between our core database engine and the Hadoop ecosystem.
  • Good understanding of XML methodologies (XML, XPath, XQuery, XSLT), including REST and SOAP web services.
  • Techno-functional responsibilities include interfacing with users, identifying functional and technical gaps, estimates, designing custom solutions, development, leading developers, producing documentation, and production support.
  • Excellent interpersonal and communication skills; creative, research-minded, technically competent, and results-oriented with problem-solving and leadership skills.
  • Experience in managing and reviewing Hadoop log files.
  • Very good experience with the complete project life cycle: design, development, testing, and implementation of client-server and web applications.
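
(Illustrative sketch for the Sqoop bullet above.) A minimal example of importing a relational table into HDFS and exporting processed results back with Sqoop; the host, database, table, and path names are hypothetical placeholders, not details of any engagement listed below.

```
#!/bin/sh
# Minimal Sqoop sketch; all host, schema, table, and path names are hypothetical.

# Import a relational table into HDFS as comma-delimited text,
# using 4 parallel map tasks (-P prompts for the password).
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /user/etl/orders \
  --fields-terminated-by ',' \
  -m 4

# Export processed results from HDFS back into a relational table.
sqoop export \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table order_summary \
  --export-dir /user/etl/order_summary \
  --input-fields-terminated-by ','
```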

TECHNICAL SKILLS

Hadoop/Big Data: Cloudera Distribution, HDFS, MapReduce, Pig, Hive, Impala, YARN, Kafka, Hadoop Cascading, HBase, Cassandra, Spark Streaming, Spark SQL, Scala, Sqoop, Flume, Oozie, Jenkins, ZooKeeper

IDEs: Eclipse, Microsoft Visual Studio, Micro Focus NET Express 5.0

Build Tools: Maven

Version Control: Git, Git-Flow

Web Server: Jetty

Programming languages: Core Java, Linux shell scripts, COBOL, PL/SQL

Databases: Oracle 11g/10g/9i, DB2, MS SQL Server, DB2 UDB, MarkLogic

Tools: Microsoft Visual Studio 2010, TFS, Change Management, Data File Tools, Micro Focus NET Express, Shell Scripting - Bash Shell, Korn Shell, C Shell, VI Editor, Putty, Animator, SQL Station, PVCS Dimensions, Toad, Cube-D, Clear Quest, SR Tool, ALM, PPM, Business Intelligence (BO Tools), Micro Focus Net Express 5.1, CALWIN, Hummingbird Exceed, TDMS, Case Copy, PTR, Revision Log, PanValet, Oracle SQL Developer, CE2000, ClearCase Client (CCRC), MATT Tool and FTF, Control-M, Beyond Compare, SED - Stream Editor, VCTL - Versioning Control, Ultra Edit, OTSORT, Win SQL, Win VI, MMIS, Win SCP Tool, File Zilla, Developer 2000 - D2K, Oracle Forms, Form Builder, FTP, Novell Netware, Star Team, Citrix, Etracker, CA7, MTP, Boulevard, Utimatrix, Peregrine, APLTS, SAR, SAP, SAS, TSO, ISPF, Quality Center, UniCenter, PIV, SDSF, Lotus Notes, Microsoft Office - MS Word, MS Excel, MS Visio, MS Outlook, etc.

Testing: ALM, QTP

Methodologies: Agile, Waterfall

PROFESSIONAL EXPERIENCE

Confidential

Hadoop Developer

Responsibilities:

  • Real-time streaming of data using Spark with Kafka (see the sketch after this list).
  • Implemented Spark RDD transformations and actions to carry out business analysis.
  • Converted all Hadoop Cascading flows to the enterprise data hub using Spark Streaming.
  • Ingested data through an HTTP Jetty server and loaded it into Kafka topics.
  • Participated in development/implementation of Cloudera Hadoop environment.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Responsible for managing data coming from different sources.
  • Gained good experience with NoSQL databases.
  • Involved in loading data from the UNIX file system into HDFS.
  • Handled importing of data from various sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Teradata into HDFS using Sqoop.
  • Analyzed the data by performing Hive queries and running Pig scripts to know user behavior.
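
(Illustrative sketch for the Kafka/Spark Streaming bullets above.) A minimal example of creating the Kafka topic a Jetty ingestion layer would write to and submitting a Spark Streaming consumer on YARN; the topic, host, jar, and class names are hypothetical, and the commands assume a Kafka 0.8/0.9-era installation where topics are created through ZooKeeper.

```
#!/bin/sh
# Hypothetical topic, host, jar, and class names; assumes a Kafka
# 0.8/0.9-era install (topics created via ZooKeeper) and Spark on YARN.

# Create the Kafka topic that the Jetty ingestion layer writes to.
kafka-topics.sh --create \
  --zookeeper zkhost:2181 \
  --replication-factor 2 \
  --partitions 4 \
  --topic web-events

# Sanity-check ingestion by tailing the topic from the console.
kafka-console-consumer.sh --zookeeper zkhost:2181 --topic web-events

# Submit the Spark Streaming job that consumes the topic.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.streaming.WebEventJob \
  web-event-streaming-assembly.jar zkhost:2181 web-events
```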

Environment: Hadoop, MapReduce, HDFS, Kafka, Pig, core Java, Cascading, Gerrit, Git-Flow, JSON, XML, XSLT, XQuery, XPath, Altova XMLSpy, Maven, Spark Core, Spark Streaming, Spark SQL, Cloudera Manager, Docker, Oozie, Jenkins, Jira, etc.

Confidential, Roseville, CA

Information Associate

Responsibilities:

  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
  • Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.
  • Documented the system's processes and procedures for future reference.
  • Worked with systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
  • Monitored multiple Hadoop clusters environments using Ganglia.
  • Monitored workload, job performance, and capacity planning using Cloudera Manager.
  • Worked with application teams to install operating system, Hadoop updates, patches, version upgrades as required.
  • Installed and configured Flume, Hive, Pig, Sqoop and Oozie on the Hadoop cluster.
  • Used Flume to collect, aggregate, and store the web log data from different sources like web servers, mobile and network devices and pushed to HDFS.
  • Developed MapReduce programs on log data to transform it into a structured format and identify user location, age group, and time spent.
  • Analyzed the web log data using HiveQL to extract the number of unique visitors per day, page views, visit duration, and the most purchased product on the website (see the sketch after this list).
  • Exported the analyzed data to relational databases using Sqoop so our BI team could visualize it and generate reports.
  • Managed and reviewed Hadoop log files.
  • Tested raw data and executed performance scripts.
  • Created PL/SQL packages to update Oracle tables.
  • Created JCL scripts to execute PL/SQL packages.
  • Development using Micro Focus COBOL (MF-COBOL), NET Express 5.1, UNIX (AIX), Korn shell scripting, CICS, Oracle 11g, Putty, VI Editor, Animator, Clear Quest, Toad, PVCS Dimensions Tool, Cube-D, Business Objects Desktop Intelligence, Hummingbird, TDMS, Case Copy, SQL Station, File Zilla, and Microsoft Office - MS Word, MS Excel, MS Visio, etc.
  • Performed maintenance of existing Micro Focus COBOL programs and Windows-based ACU COBOL for the Confidential system.
  • Developed various transformations like Source Qualifier, Sorter transformation, Joiner transformation, Update Strategy, Lookup transformation, Expressions and Sequence Generator for loading the data into target table.
  • Involved in the preparation of OLUM Extracts and Reports documentation.
  • Wrote PL/SQL stored procedures using Toad.
  • Developed new BO Reports and modification of existing reports as per the design using Business Objects XI.
  • Conducted Mentoring sessions for new team members as per the Mentoring checklist.
  • Hands-on experience performing 24x7 on-call production support, providing rapid solutions to client problems.
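
(Illustrative sketch for the HiveQL analysis and Sqoop export bullets above.) A minimal example of aggregating web-log data in Hive and exporting the result to a relational warehouse for BI reporting; the table, column, and JDBC connection names are hypothetical.

```
#!/bin/sh
# Hypothetical table, column, and JDBC names; illustrates the Hive -> Sqoop flow only.

# Aggregate unique visitors per day from the raw web-log table into
# a Hive-managed staging table.
hive -e "
  CREATE TABLE IF NOT EXISTS daily_visitors (visit_date STRING, unique_visitors BIGINT);
  INSERT OVERWRITE TABLE daily_visitors
  SELECT to_date(event_time), COUNT(DISTINCT visitor_id)
  FROM   web_logs
  GROUP  BY to_date(event_time);
"

# Export the aggregate to the relational warehouse for the BI team;
# \0001 is Hive's default ^A field delimiter for managed text tables.
sqoop export \
  --connect jdbc:oracle:thin:@dbhost:1521/EDW \
  --username bi_user -P \
  --table DAILY_VISITORS \
  --export-dir /user/hive/warehouse/daily_visitors \
  --input-fields-terminated-by '\0001'
```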

Environment: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Cloudera Manager, Pig, Sqoop, Oozie, Micro Focus COBOL (MF-COBOL), UNIX (AIX), Korn scripting, Oracle 11g, CICS, Putty, VI Editor, Animator, Clear Quest, Toad, PL/SQL, PVCS Dimensions Tool, etc.

Confidential, Sacramento, CA

Senior Developer

Responsibilities:

  • Mapped source fields to target table columns.
  • Met with business experts to complete source-to-target data mapping.
  • Coded and tested mainframe programs to extract data from source systems.
  • Converted the DB2 database to Oracle.
  • Created SQL*Loader scripts to load files into existing tables, or designed, created, and loaded new Oracle tables (see the sketch after this list).
  • Converted VS COBOL programs to Micro Focus COBOL, COBOL-DB2 programs to embedded SQL (Oracle 10g) programs, and online (COBOL-DB2-CICS) programs to batch (COBOL-ESQL) programs.
  • Converted all CICS programs into batch (ESQL) programs.
  • Converted EBCDIC (VSAM and flat file) data to ASCII format and VSAM files to ISAM files.
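
(Illustrative sketch for the SQL*Loader and EBCDIC conversion bullets above.) A minimal example of converting an EBCDIC extract to ASCII with dd and loading it into Oracle with SQL*Loader; file, table, and credential names are hypothetical placeholders (scott/tiger is the classic Oracle demo account).

```
#!/bin/sh
# Hypothetical file, table, and credential names.

# Convert a fixed-length (80-byte record) EBCDIC extract to ASCII lines.
dd if=customers.ebc of=customers.dat cbs=80 conv=ascii,unblock

# Write a SQL*Loader control file describing the fixed-position layout.
cat > customers.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
INTO TABLE customers
(cust_id    POSITION(1:10),
 cust_name  POSITION(11:50),
 cust_city  POSITION(51:80))
EOF

# Load the converted file into the Oracle table.
sqlldr userid=scott/tiger control=customers.ctl log=customers.log
```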

Environment: Micro Focus COBOL (MF-COBOL), JCL, VSAM, ClearCase Remote Client (CCRC), Oracle 10g, SQL Developer, SQL Server 2005, VS COBOL, MVS/JCL, ISAM, CICS, FTP, SyncSort, PANVALET, Microsoft Visual Studio 2010.

Confidential, Englewood, CO

Senior Developer

Responsibilities:

  • Respond to Level 1 escalations and ensure timely resolution and escalation to L3 as needed.
  • Supported Business Continuity/DR activities.
  • Overall support for EDI (Mapping, Monitoring, and administration).
  • Define/maintain CICS online tables (administration including the File Control Table, Program Control Table, Transaction Control Tables, etc.).
  • Administering the ASG ViewDirect repository including access, specific destination printer, delivery, report definition, report retention, etc.
  • Support deployment of changes including upgrades of software to the Confidential environment.
  • Initial installation and/or configuration of 3rd-party software in the production environment per installation instructions provided by the Rehost team.
  • Involvement in password reset requests.
  • Troubleshot Micro Focus Enterprise Server administration issues, CA Workload Automation, ASG Document Direct, Infogix Assure, and WDI issues.
  • Prepared and processed LDIF files to create user accounts in Micro Focus (Active Directory) and to grant access to Micro Focus groups (see the sketch after this list).
  • Provided user access to CA Workload Automation, ASG ViewDirect, Infogix, Connect Direct, etc.
  • Excellent experience installing SSL certificates in the Micro Focus environment (TN3270, TOR/AOR, ESMAC, and ESA encryption).
  • Troubleshot issues when recycling the Micro Focus server regions.
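
(Illustrative sketch for the LDIF bullet above.) A minimal example of preparing an LDIF file that creates a user and adds it to a group, then processing it with the standard OpenLDAP client; the directory suffix, DNs, and group names are hypothetical, and a real Micro Focus ESM setup would target its configured directory server.

```
#!/bin/sh
# Hypothetical DNs and group names; a real Micro Focus ESM setup would
# target its configured directory server instead of this example suffix.

# Prepare an LDIF file that creates a user and adds it to an existing group.
cat > newuser.ldif <<'EOF'
dn: cn=jdoe,ou=users,dc=example,dc=com
objectClass: inetOrgPerson
cn: jdoe
sn: Doe
uid: jdoe

dn: cn=mf-operators,ou=groups,dc=example,dc=com
changetype: modify
add: member
member: cn=jdoe,ou=users,dc=example,dc=com
EOF

# Process the file: -a treats records without a changetype as adds.
ldapmodify -a -x -D "cn=admin,dc=example,dc=com" -W -f newuser.ldif
```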

Environment: Micro Focus Enterprise Server administration, CA Workload Automation, ASG ViewDirect, ASG Document Direct, Infogix, Connect Direct, VSS, SSL encryption, SQL, DB2 UDB, VS, and Microsoft Office - MS Word, MS Excel, MS Visio.

Confidential, Newark, DE

Senior Developer

Responsibilities:

  • Gathered business requirements and worked with business analysts to determine where changes were needed in the system. These changes were made to meet the federal Health Insurance Portability and Accountability Act (HIPAA) conversion and compliance requirements.
  • Involved in Analysis, Design, Coding and testing extensively.
  • Performed coding, testing and debugging for MF/COBOL programs.
  • This position involved testing and development of new, complex COBOL programs and data.
  • Interacting with business users for gathering business requirements to meet the customer demands.
  • Daily interaction with the Business Process Owners across the company.
  • Involved in the preparation of business designs, technical designs, and walkthroughs.
  • Generated reports for both the Claims and Financial subsystems. Report generation included writing SQL queries per the design specifications, developing Micro Focus COBOL batch code, and writing UNIX shell scripts (see the sketch after this list).
  • Designed, constructed, supported, and maintained software using skills in Micro Focus COBOL, UNIX (Sun Solaris), Micro Focus Net Express (IDE), SQL, DB2, VI Editor, Putty, Win VI, Win SQL, Win SCP Tool (FTP), MMIS, VCTL, etc.
  • Created data files as Excel files, merged multiple Excel files into one workbook using VB scripting, and placed them on a network drive.
  • Construction of the programs in accordance with the Technical System Design.
  • Code reviewing and code walkthroughs of other members of the team.
  • Program Analysis for code changes and Testing of coded software component.
  • Reviewed the user interfaces developed by various developers and provided technical support for reports.
  • Work with QA team to get approval of developed application.
  • Report weekly project execution status to Client Manager.
  • Resolve any issues with execution.
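
(Illustrative sketch for the report-generation bullet above.) A minimal Korn shell example of producing a batch report from DB2 via the command line processor; the database, credentials, and table names are hypothetical, and DB2_PASSWORD is assumed to be set in the job's environment.

```
#!/bin/ksh
# Hypothetical database, credentials, and table names.

REPORT=/tmp/claims_report_$(date +%Y%m%d).txt

# Connect to the database through the DB2 command line processor.
db2 connect to CLAIMSDB user rptuser using "$DB2_PASSWORD"

# Run the report query; -x suppresses column headers for clean batch output.
db2 -x "SELECT claim_id, claim_status, paid_amount
        FROM   claims
        WHERE  status_date = CURRENT DATE" > "$REPORT"

db2 connect reset

echo "Report written to $REPORT"
```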

Environment: Micro Focus-COBOL, JCL, UNIX(Sun Solaris), Microfocus Net Express (IDE), Enterprise Server, Korn Shell, SED, SQL, DB2, Putty, VI Editor, Win SQL, MMIS, Win VI, and Microsoft Office- MS Word, MS Excel, MS Visio and MS Outlook.

Confidential, Franklin, MA

Senior Developer

Responsibilities:

  • Actively involved in analysis, coding, testing, debugging, maintenance of new and existing programs, troubleshooting, creating job procedures, turn over to production, and post implementation support.
  • Interacting with business users for gathering business requirements to meet the customer demands.
  • Provided technical support for reports, loading/accessing/modifying data in DB2 UDB tables through Micro Focus COBOL programs.
  • Designed, developed, supported, and maintained software using skills in Micro Focus COBOL, UNIX, SQL, DB2 UDB, JCL, FTP, Micro Focus Net Express (IDE), Server Express, Enterprise Server, etc.
  • Work with QA team to get approval of developed application.
  • Providing necessary reports for verification.
  • Report weekly project execution status to Client Manager.
  • Provided technical help to offshore in resolving the problems in Mainframe environment applications.
  • Excellent working knowledge of large VSAM data files, their reorganization processes, and efficient access methods.
  • Provided technical support for reports, loading/accessing/modifying data in DB2 tables through mainframe programs.
  • Designing, Developing and Maintaining Restitution applications using Microfocus COBOL.
  • Responsible for maintaining/cleaning the database (ISAM data files/DB2 tables) on a request basis.
  • Responsible for analyzing the business process by collecting metrics, documenting the findings, and providing possible solutions to improve the processes.
  • All work was done in a very fast-paced and results-oriented environment.
  • Responsible for interacting with the end user to guide and resolve issues with the business applications.

Environment: Micro Focus COBOL, UNIX(RED HAT LINUX), Korn Shell Scripting, JCL, VSAM, DB2 UDB, CICS, Putty, VI Editor, SQL, MFEEE-Microfocus Server Express, Enterprise Server, NET Express 5.1, Animator, IBM-DB2, Lotus Notes, File Zilla, FTP and Microsoft Office- MS Word, MS Excel, MS Access, MS Visio.

Confidential

Software Engineer

Responsibilities:

  • Program Analysis for code changes.
  • Preparation of requirement specifications and the technical specification document (TSD).
  • Construction of the program in accordance with the TSD.
  • Testing of coded software component.
  • Document the application details.
  • Drafting test scenarios, test scripts to get the User Acceptance for the software solutions provided by onsite/ offshore team.
  • Automated heavily person-dependent processes, such as the purge process for different products, for reuse.
  • Preparation of Control plans for all applications with process and system information.
  • Merging applications to reduce base maintenance costs.
  • Report generation for various application and system users.

Environment: VS-COBOL-II, MVS/JCL, IBM Utilities, DB2, CICS, SQL, Endevor, Peregrine, Boulevard, APLTS, SAR, Expeditor, Ultimatrix, FTP tool, and Microsoft Office - MS Word, MS Excel, etc.
