
Hadoop Developer/administrator Resume

Sunnyvale, California

SUMMARY

  • Around 8 years of IT experience in development, maintenance and production support projects on platforms such as Hadoop, Pega and Mainframes.
  • Over 3 years of experience with Hadoop, HDFS, Hive, Pig, Sqoop, Flume, Oozie, HBase, Pentaho and MapReduce programming.
  • Working experience on Pentaho data integration (PDI), which is a graphical design environment for building a solution for extract, transform, and load (ETL) jobs.
  • Hands-on experience in installing, configuring, administering and working with the Hadoop ecosystems of major Hadoop distributions like Cloudera, with good knowledge of the Hortonworks distribution.
  • Good hands-on experience with InfoSphere BigInsights (Pig, HBase, Hive and Big SQL).
  • Having approximately 2 years of Linux administration experience.
  • Cloudera Certified Developer for Apache Hadoop(CCDH) CCD-410.
  • Exposure to different flavors of Linux (Ubuntu, CentOS, Red Hat and SUSE).
  • Good exposure to ecosystem tools like Pig, Hive, Sqoop, MapReduce, Flume, Ambari, ZooKeeper, Oozie, HBase and Pentaho.
  • Working experience on R statistical computing for data analysis.
  • Capturing data from existing databases that provide SQL interfaces using Sqoop.
  • Worked on scheduling Hadoop jobs using Apache Oozie.
  • Experience with relational databases (RDBMS), SQL and NoSQL (Not Only SQL) databases.
  • Used Ambari for monitoring and managing Hadoop and HBase clusters.
  • Efficient in building Hive, Pig and MapReduce scripts.
  • Good knowledge of SQL-H and schemaless SQL.
  • Around 1.5 years of experience with Pega, covering Process Flows, User Interface, Activities, Class Structure, etc.
  • Hands-on experience using Tracer and Clipboard.
  • Certified System Architect for PRPC with a score over 80%.
  • Strong experience in COBOL, JCL, DB2, CICS, VSAM and Easytrieve development from scratch.
  • Involved in preparing design document and Architecture design document (ADD).
  • Around 3.5 years of experience with, and good knowledge of, agile methodology.
  • Worked in the Retail, Wholesale, Finance and Insurance domains.
  • Good exposure to requirement analysis, design, development, UAT, application maintenance and support.
  • Basic knowledge of application design using Unified Modeling Language (UML), Sequence diagrams, Use Case diagrams, Entity Relationship Diagrams (ERD) and Data Flow Diagrams (DFD).
  • Comprehensive knowledge of the Software Development Life Cycle, coupled with excellent communication skills.
  • Enthusiastic and eager to take initiative and responsibility in any given task
  • Strong technical and interpersonal skills combined with great commitment towards meeting deadlines
  • Experience working in both team and individual environments. Always eager to learn new technologies and implement them in challenging environments
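For context, the Oozie job scheduling noted above is normally driven by an XML workflow definition. A minimal sketch of such a workflow (the application name, script name and properties are illustrative only, not taken from an actual project):

```xml
<workflow-app name="daily-pig-job" xmlns="uri:oozie:workflow:0.4">
  <start to="pig-node"/>
  <action name="pig-node">
    <pig>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>analysis.pig</script>
    </pig>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Pig action failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>
```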

TECHNICAL SKILLS

Programming Languages: C, Java (Core Java), R, COBOL, JCL, Easytrieve, CICS, REXX and PL/I

Operating Systems: Windows, Linux (CentOS, Ubuntu, SUSE), UNIX, VM/CMS, z/OS, MVS

Big Data: HDFS, MapReduce

Ecosystem Tools: Pig, Hive, Flume, HBase, ZooKeeper, Sqoop, Ambari and Pentaho

Databases: VSAM, MS Access, DB2, MySQL, NoSQL and Big SQL

Development IDEs: MyEclipse 7.0

PROFESSIONAL EXPERIENCE

Confidential, Sunnyvale, California

Hadoop Developer/Administrator

Responsibilities:

  • Gathered the business requirements from Business Partners and Subject Matter Experts.
  • Involved in installing Hadoop Ecosystem components.
  • Managing and reviewing the Hadoop log files.
  • Responsible for managing and loading data into the Hadoop cluster from different sources.
  • Supported MapReduce programs running on the cluster.
  • Involved in HDFS maintenance and loading of structured and unstructured data.
  • Developed MapReduce processes using the Java API.
  • Installed and configured Pig and developed Pig Latin scripts.
  • Monitoring and managing Hadoop cluster using Ambari.
  • Monitored and imported pre-existing Oozie workflows for Pig and Hadoop jobs.
  • Used HBase (a column-oriented database) for random access to data.
  • Used PDI (Pentaho Data Integration) to extract, transform and load (ETL) using metadata driven approach.
  • Used Pentaho to migrate data between different databases and applications.
  • Used Pentaho for execution and scheduling of ETL jobs.
  • Developed R programs for reports.
  • Imported data frequently from MySQL to HDFS using Sqoop.
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Wrote Hive queries for data analysis to meet the business requirements.
  • Created Hive tables and worked on them using HiveQL.
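The Sqoop import and Hive analysis steps above typically look like the following command sketch (connection details, table and column names are hypothetical; it assumes a running Hadoop cluster with Sqoop and Hive installed):

```shell
# Import a MySQL table into HDFS with Sqoop (hypothetical connection/table).
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4

# Expose the imported files as an external Hive table and run an analysis query.
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS orders (
  order_id INT, customer_id INT, amount DOUBLE)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/raw/orders';
SELECT customer_id, SUM(amount) AS total FROM orders GROUP BY customer_id;"
```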

Environment: Hadoop, HDFS, MapReduce, Pig, Hive, HBase, Sqoop, Pentaho, Oozie, Ambari, Flume, ZooKeeper, R (for reporting) and the Ubuntu Linux operating system

Confidential, Minneapolis, Minnesota

Hadoop Developer/Administrator

Responsibilities:

  • Set up Hadoop clusters and the Hadoop ecosystem environment for different big data analytics projects in data centers.
  • Proactive monitoring and support of Hadoop clusters, and deployment of applications from the development environment to production servers or the Amazon cloud.
  • Monitored and managed the Hadoop cluster.
  • Developed Pig scripts for data analysis.
  • Benchmarked the Hadoop cluster for optimum performance.
  • Deployed and evaluated various open source applications and provided support for different open source projects.
  • Assisted the team with any technical issues or hiccups.
  • Organized weekly team meetings to get work status updates from offshore and onsite.
  • Held monthly calls with the client to report the status of critical tasks handled by the team and to discuss the scope of work the team could take on the next month.
  • Update team weekly status to ADM and project manager.
  • Perform the role of quality lead for the team.
  • Perform monthly and quarterly audit.
  • Perform Secondary control audit for the implementations done for previous month.
  • Perform separation of duties (SOD) audit.
  • Create work items for the team in the Rational Team Concert (RTC) tool.
  • Close the tickets or follow up with the team to close the tickets in time on Remedy tool.
  • Create tasks for team in Clarity tool for team to claim hours for project which are in sync with ILC.
  • Work with IOC and on call teams when Sev1 or 2 tickets are resolved.
  • Upload the audit documents in shared/team folder.
  • Create MOM (Minutes of Meeting) for the meetings and follow up on the required action items.
  • Co-ordinate with users who perform testing and get approval for the change to be implemented.
  • Work with Business Objects (BO) and Mission Mode tools.
  • Involved in code reviews for team members' tasks and in unit testing.
  • Implemented a pilot project using InfoSphere BigInsights on the SUSE Linux operating system.
  • Gained good hands-on experience with InfoSphere BigInsights (Pig, HBase, Hive and Big SQL).
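The cluster benchmarking mentioned above is commonly done with the standard TeraGen/TeraSort/TeraValidate suite shipped in the Hadoop examples jar; a sketch of a typical run (the jar path varies by distribution and is an assumption here):

```shell
# Generate 10 million 100-byte rows, sort them, then validate the sort order.
EXAMPLES_JAR=/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar
hadoop jar "$EXAMPLES_JAR" teragen 10000000 /benchmarks/terasort-input
hadoop jar "$EXAMPLES_JAR" terasort /benchmarks/terasort-input /benchmarks/terasort-output
hadoop jar "$EXAMPLES_JAR" teravalidate /benchmarks/terasort-output /benchmarks/terasort-report
```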

Environment: Hadoop, HDFS, MapReduce, Pig, Hive, Flume, HBase, Java, Passport, ServConn, COBOL, JCL, Red Hat Linux operating system (administration), SUSE Linux, InfoSphere BigInsights (Pig, HBase, Hive and Big SQL)

Confidential

Team Lead, Developer, Tester and Production support analyst

Responsibilities:

  • Analysis of High-level Requirement/RFS (request for service).
  • Involved in business and technical use case review.
  • Involved in low level design, coding, enhancements and unit testing.
  • Involved in code reviews for team members' tasks and in unit testing.
  • Working as quality coordinator for the team.
  • Prepared UDD (Understanding design document).
  • Prepared TSD (Technical specification Document), and IP (Implementation Plan).
  • Updating the AID (Application Information Document) when changes are required.
  • Attending WSR calls (weekly status report) with client.
  • Creating MoM (Minutes of Meeting) for WSR calls (weekly status report) and weekly team meeting.
  • Experience in working with CICS screens
  • Production Support analysis and fixing abends
  • Monitoring the batch and verifying the files sent to the down systems.
  • Updating ITSM with proper resolution to tickets.
  • Creating Production Log and circulating across the team.
  • Unit Test Cases creation.
  • Unit Testing.
  • Quality check for code and Test Results.
  • Individually handled the Unit testing, System testing and User acceptance testing of both online and batch components/programs.
  • Preparation of Test results document.
  • Conducting self-reviews and Peer reviews.

Environment: Pega PRPC V5.5, Java (strong Core Java skills), Mainframe, COBOL, JCL, VSAM, REXX and Easytrieve on the VM/CMS operating system
