
Hadoop Developer Resume


Chicago

SUMMARY

  • 8+ years of IT experience, including 3 years in Big Data covering the analysis, design, coding, testing, and implementation of Hadoop components such as the Hadoop framework, MapReduce programming, Pig, Hive, HBase, Cassandra, Flume, Sqoop, YARN, and Impala.
  • Excellent understanding of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Hands on experience in installing, configuring, and using Apache Hadoop ecosystem components like Map Reduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, and Flume.
  • Extended Hive and Pig core functionality by writing custom UDFs.
  • Experience building and supporting large-scale Hadoop environments, including design, configuration, installation, performance tuning, and monitoring.
  • Experience in managing and troubleshooting Hadoop related issues.
  • Knowledge of job/workflow scheduling and monitoring tools such as Oozie and ZooKeeper.
  • Experience analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java.
  • Passionate about Hadoop and Big Data technologies, including Hortonworks and Cloudera.
  • Expertise in creating Hive internal/external tables and views with Hive's analytical functions, and in writing HiveQL scripts.
  • Experience in NoSQL database HBase.
  • Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Performed data analysis using Hive and Pig.
  • Loaded streaming log data from various web servers into HDFS using Flume.
  • Experience importing and exporting terabytes of data between HDFS and relational database systems using Sqoop.
  • Successfully loaded files into Hive and HDFS from Oracle and SQL Server using Sqoop.
  • Created Hive tables to import large data sets from various relational databases using Sqoop, and exported the analyzed data back for visualization and report generation by the BI team (see the sketch after this list).
  • Documented and explained implemented processes and configurations during upgrades.
  • Supported development, testing, and operations teams during new system deployments.
  • Excellent knowledge of EDI tools such as Foresight's EDISIM.
  • Experience using WebSphere MQ objects.
  • Basic knowledge of IBM WebSphere Message Broker (WMB) 6.5 and 7.1, including message flows and nodes.
  • Experience with RDBMS database applications on Oracle 8i, DB2, MS Access, and SQL Server.
  • Very good understanding of EDI/XML/RosettaNet implementation guidelines.
  • Well experienced in testing large, complex databases and in reporting and ETL tools such as Informatica and DataStage.
  • Well experienced in testing data loads, data transformation, and data quality.
  • Strong communication, analytical, and problem-solving skills; a team player equally effective working in a group or individually.
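
A minimal shell sketch of the Sqoop-to-Hive ingest pattern described in the bullets above. It is illustrative only: the Oracle connection string, credentials path, ORDERS table, HDFS locations, and the UDF jar and class (udfs.jar, com.example.udf.NormalizeSku) are hypothetical placeholders, not values from any actual engagement.

    #!/usr/bin/env bash
    # Sketch only: hostnames, tables, paths, and the UDF class are placeholders.

    # 1. Pull a table from Oracle into HDFS with four parallel mappers.
    sqoop import \
      --connect jdbc:oracle:thin:@dbhost:1521:ORCL \
      --username etl_user --password-file /user/etl/.pw \
      --table ORDERS \
      --target-dir /data/raw/orders \
      --fields-terminated-by '\t' \
      --num-mappers 4

    # 2. Expose the imported files through a Hive external table and
    #    register a custom UDF shipped as a jar (the UDF itself is Java).
    hive <<'HQL'
    CREATE EXTERNAL TABLE IF NOT EXISTS raw_orders (
      order_id BIGINT,
      sku      STRING,
      amount   DOUBLE
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/raw/orders';

    ADD JAR /home/etl/lib/udfs.jar;
    CREATE TEMPORARY FUNCTION normalize_sku AS 'com.example.udf.NormalizeSku';

    SELECT normalize_sku(sku), SUM(amount)
    FROM raw_orders
    GROUP BY normalize_sku(sku);
    HQL

Dropping an external table removes only its metadata; the files under /data/raw/orders stay in place, which is why external tables suit raw ingest directories.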

TECHNICAL SKILLS

Hadoop Ecosystem: Pig, Hive, MapReduce, Flume, Oozie, HBase, HDFS, ZooKeeper, Hue, YARN, Impala

Mapping Tools: Mercator Design Studio 8.0, 7.5.1, 6.7, 6.5, 5.0, Integration Flow Designer, Database Interface Designer, Commerce Manager

Operating Systems: AS/400, Windows XP, Windows 9x/2000, Windows NT 4.0, UNIX, MS-DOS

RDBMS: Oracle, SQL Server, MS Access; Toad (query/administration tool)

NoSQL: HBase, Cassandra

Internet Technologies: e-commerce, HTML, XML, JSON

Tools/Packages: MS Office, Visual SourceSafe (VSS), Git, VCS, Visual InterDev, MS Visio, UltraEdit, EDISIM, XML Spy, EDIFECS, Ramp Manager, HP SCM.

Programming Languages: C, C#, VB.NET, Core Java, Python.

PROFESSIONAL EXPERIENCE

Confidential, Chicago

Hadoop Developer

Responsibilities:

  • Analyzed the scope of the project.
  • Involved in review of functional and non-functional requirements.
  • Participated in the requirement gathering and analysis phase, documenting business requirements through workshops and meetings with business users, and was involved in building scalable distributed data solutions on Hadoop.
  • Involved in setting up a 10-node Hadoop cluster for the initial POC project.
  • Handled data coming from different data sources and loaded from UNIX file system into HDFS.
  • Loaded and transformed large sets of structured, semi structured and unstructured data.
  • Created Hive internal and external tables, loaded them with data, and wrote Hive queries and scripts that run internally as MapReduce jobs.
  • Exported the resulting sentiment-analysis data to Tableau for dashboard creation.
  • Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.
  • Managed and reviewed terabytes of log files with Hadoop streaming jobs.
  • Imported and exported data between HDFS and Hive using Sqoop.
  • Moved log files generated by various sources into HDFS for further processing through Pig.
  • Developed Pig Latin and HiveQL scripts for data analysis and ETL.
  • Created Hive internal and external tables with partitions and buckets for further analysis using Hive joins.
  • Exported the analyzed data to relational databases using Sqoop for visualization and BI report generation.
  • Managed and scheduled jobs on the Hadoop cluster using Oozie workflows and the Oozie coordinator engine (see the sketch after this list).
  • Continuously monitored and managed the Hadoop cluster using Cloudera Manager.
  • Performed unit testing for all components using JUnit.
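
A minimal shell sketch of the Hive, Sqoop-export, and Oozie steps from the bullets above. The table definitions, MySQL URL, warehouse path, and Oozie endpoint are hypothetical placeholders.

    #!/usr/bin/env bash
    # Sketch only: table names, the JDBC URL, paths, and the Oozie endpoint
    # are placeholders.

    # Partitioned, bucketed Hive table used for the join-heavy analysis.
    # Plain text storage keeps the Sqoop export below simple.
    hive <<'HQL'
    CREATE TABLE IF NOT EXISTS clicks_agg (
      user_id BIGINT,
      hits    INT
    )
    PARTITIONED BY (dt STRING)
    CLUSTERED BY (user_id) INTO 32 BUCKETS
    STORED AS TEXTFILE;
    HQL

    # Export one analyzed partition back to MySQL for the BI team
    # ('\001' is Hive's default field delimiter for text tables).
    sqoop export \
      --connect jdbc:mysql://dbhost/reports \
      --username bi_user --password-file /user/etl/.pw \
      --table daily_clicks \
      --export-dir /user/hive/warehouse/clicks_agg/dt=2014-01-01 \
      --input-fields-terminated-by '\001'

    # Submit the Oozie workflow that strings these steps together on a schedule.
    oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run

Clustering on user_id means rows for the same user land in the same bucket, which can enable bucketed map joins when the other table is clustered on the same key.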

Environment: Hadoop 0.20 (Cloudera CDH3/CDH4), HBase, Java MapReduce, Pig, Hive, Hue, HDFS, Sqoop, MySQL, Oozie, Java (JDK 1.6), Oracle 10g.

Confidential, NY

WTX Developer

Responsibilities:

  • Analyzed the scope of the project.
  • Interacted with the business team to understand the requirements.
  • Developed a complete understanding of EDI SWIFT transactions.
  • Developed type trees from database views and XML schemas.
  • Upgraded components from WTX 6.7 and 7.5 to WTX 8.0/8.2.
  • Created Test Cases and Test data for different SWIFT Transactions.
  • Worked with WebSphere MQ objects to route data to different queues (see the sketch below).
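
A hedged sketch of defining the kinds of WebSphere MQ objects used for this routing, driven from the shell via runmqsc. The queue manager and queue names are placeholders.

    #!/usr/bin/env bash
    # Sketch only: the queue manager (QM1) and queue names are placeholders.
    runmqsc QM1 <<'MQSC'
    * Local queue receiving inbound SWIFT messages
    DEFINE QLOCAL(SWIFT.IN) DEFPSIST(YES) REPLACE
    * Local queue for translated output, plus an alias the maps route to,
    * so the physical target can change without touching the mapping layer
    DEFINE QLOCAL(SWIFT.OUT) DEFPSIST(YES) REPLACE
    DEFINE QALIAS(SWIFT.ROUTE) TARGQ(SWIFT.OUT) REPLACE
    MQSC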

Environment: WTX 6.7.1/7.5/8.0/8.2, PL/SQL, FTP (CuteFTP), EDI SWIFT, UltraEdit, Windows 2000, DB2, WebSphere MQ 6.1.

Confidential, DURHAM, NC

EDI Analyst

Responsibilities:

  • Analyzed the scope of the project.
  • Involved in preparing requirements with business users.
  • Interacted with the business and user acceptance testing (UAT) teams to understand the requirements.
  • Prepared professional and institutional X12 test files for the scenarios given by the business team.
  • Developed a complete understanding of the X12 transaction sets.
  • Prepared test scenarios and documented the results.
  • Analyzed all the manually developed test files using Ramp Manager and EDIFECS Analyzer.

Environment: WTX 8.2, Oracle 11i, Microsoft Visual SourceSafe 2008, PL/SQL, FTP (CuteFTP), EDI ANSI X12 V4010/V5010, Ramp Manager, Edifecs Analyzer.

Confidential, Honolulu, Hawaii

WTX Developer

Responsibilities:

  • Developed design documents for the Renovation and 27x Batch Renovation projects.
  • Developed mapping documents for database loads, the NPI crosswalk, and audit maps.
  • Developed an audit-balancing map to verify that 837 claims flow properly through the WTX system.
  • Created Test cases and test data for 837I/P and Medicare claims for 5010.
  • Involved in end to end string testing for 837I/P and Medicare claims for 5010.
  • Worked on the team developing a tool to convert 4010 claims to 5010 using the implementation guides.
  • Developed technical documents for all the maps in the WTX system for 837 claims for 5010.
  • Developed the flow diagram of the 5010 claims processing in the WTX system.
  • Developed Perl scripts to move files after Instream validation (compliance checking); a shell sketch of this routing appears after this list.
  • Developed type trees from database views for the database load maps, which load all data from X12 files into the Oracle database.
  • Developed type trees and maps that load the 837 claims data into the Oracle database.
  • Involved in the performance tuning of the Database load maps for time-efficient processing of the claims.
  • Prepared test cases to test Instream Custom Edit codes and HIPAA Edit codes.
  • Played a major role in 24/7 production support for transactions such as 837I/P and 27X (V4010).
  • Upgraded components to WTX 8.2 from WTX 8.0/DSTX 7.5.
  • Prepared test cases for BizTalk team for 5010 Renovation Project.
  • Used web services to convert 5-digit ZIP codes to 9-digit ZIP codes in the NPI crosswalk.
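
The production scripts for the post-validation file moves were Perl; below is a minimal shell equivalent of the same routing logic. The directory names are placeholders, and instream_check stands in for the actual Instream compliance-checker invocation.

    #!/usr/bin/env bash
    # Sketch only: directories are placeholders; instream_check stands in
    # for the real Instream compliance-checker command.
    for f in /edi/inbound/*.x12; do
      if instream_check "$f"; then
        mv "$f" /edi/clean/      # passed compliance: hand off to the load maps
      else
        mv "$f" /edi/rejects/    # failed: park for analyst review
      fi
    done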

Environment: WTX 8.2, Web Services, Oracle 11i, Microsoft Visual SourceSafe 2008, PL/SQL, FTP (CuteFTP), EDI ANSI X12 V4010/V5010, Perl, UNIX, UltraEdit, Windows 2000, Oracle 10g, WebSphere MQ.
