Hadoop Developer/Lead Resume
Salt Lake City, UT
SUMMARY
- 8.6 years of IT industry experience, including over 2.5 years of experience in the Hadoop ecosystem.
- Extensive knowledge and experience in Big Data with MapReduce, HDFS, Hive, Pig, Oozie and Sqoop.
- Good knowledge of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce (MRv1), with an understanding of YARN.
- Good experience developing MapReduce programs on Apache Hadoop to analyze large data sets efficiently.
- Developed UDFs for Hive using Java.
- Designed an automated process for mapping metadata lineage for an enterprise-wide Data Governance initiative using Metadata Hub and Ab Initio.
- Strong understanding of NoSQL databases like HBase.
- Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
- Strong experience in data analysis with extensive experience in Ab Initio (GDE, Co>Operating System).
- Strong knowledge of all phases of the Software Development Life Cycle (SDLC), and experience in Waterfall and Agile/Scrum development methodologies.
- Strong communication skills across various teams, with the ability to work on multiple projects and prioritize workload.
- Good domain knowledge of Banking & Financial Services, especially Cards.
- UNIX shell scripting experience to automate batch applications.
- Hands-on experience with R programming for obtaining and cleaning data for analytical purposes.
- Expertise in manual testing of Ab Initio ETL applications.
- Involved in High Level Design and Detailed Level Design Documents for maintenance and enhancement of Check processing applications.
- Served as the onshore counterpart, working with stakeholders to optimize resource utilization and improve the quality of deliverables by devising new processes and utilities.
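The Java Hive UDFs mentioned above can be sketched as a minimal example; the card-masking rule, class name, and method are hypothetical illustrations, and in an actual Hive deployment the class would extend `org.apache.hadoop.hive.ql.exec.UDF` from the hive-exec library rather than being plain Java.

```java
// Minimal sketch of Hive-UDF-style logic (hypothetical example).
// In Hive, this class would extend org.apache.hadoop.hive.ql.exec.UDF
// and be registered with CREATE TEMPORARY FUNCTION; it is written as
// plain Java here so it is self-contained.
public class MaskCardNumber {
    // Keep the last four digits of a card number and mask the rest,
    // e.g. "4111111111111111" -> "************1111".
    public static String evaluate(String cardNumber) {
        if (cardNumber == null || cardNumber.length() <= 4) {
            return cardNumber; // too short to mask meaningfully
        }
        int keep = 4;
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < cardNumber.length() - keep; i++) {
            masked.append('*');
        }
        masked.append(cardNumber.substring(cardNumber.length() - keep));
        return masked.toString();
    }

    public static void main(String[] args) {
        System.out.println(evaluate("4111111111111111")); // ************1111
    }
}
```

Once packaged into a JAR and registered, such a function could be called from HiveQL like any built-in, e.g. `SELECT mask_card(card_no) FROM txns;` (function name illustrative).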
TECHNICAL SKILLS
Hadoop/BIG Data: HDFS, HBase, Hive, Sqoop, Oozie, Flume, Pig and MapReduce.
BI tools: Ab Initio
Data Governance: Metadata Hub
NoSQL Databases: HBase
Database: Oracle 9i/10g/11g, DB2, SQL Server, MySQL
Version Control: SVN
Operating Systems: Mainframe z/OS, HP-UX, Ubuntu Linux and Windows XP/Vista/7/8
Scheduling tools: Autosys, Tivoli, Control-M, CA7.
Languages: SQL, C, Core Java, AWK, Shell Scripting.
Host: COBOL, JCL, GDG files, VSAM, Flat Files.
PROFESSIONAL EXPERIENCE
Confidential, Salt Lake City, UT
Hadoop Developer/Lead
Responsibilities:
- Gathered business requirements from business partners and users.
- Imported data from Oracle into Hive and HDFS using Sqoop, for both one-time and daily loads.
- Worked on a multi-node Big Data Hadoop environment.
- Developed MapReduce programs and Pig scripts to aggregate daily eligible and qualified transaction details and store them in HDFS and Hive.
- Integrated Hive with Tableau dashboard reporting, giving business users the ability to interact with Hive tables.
- Exported the aggregated data back to Oracle, where it is used to generate business reports.
- Tuned HiveQL and Pig scripts to improve performance.
- Troubleshot performance issues and tuned the Hadoop cluster.
- Created Pig UDFs to improve code reusability and modularity.
- Applied partitioning and bucketing concepts in Hive, designing both managed and external tables for optimized performance.
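The managed/external table design with partitioning and bucketing described above can be sketched in HiveQL; all table names, column names, bucket counts, and paths below are hypothetical illustrations, not the actual schema.

```sql
-- Hypothetical managed table, partitioned by load date and bucketed by
-- account id for more efficient joins and sampling.
CREATE TABLE transactions_qualified (
    txn_id     BIGINT,
    account_id STRING,
    amount     DECIMAL(18,2)
)
PARTITIONED BY (load_dt STRING)
CLUSTERED BY (account_id) INTO 32 BUCKETS
STORED AS ORC;

-- Hypothetical external table over raw files landed in HDFS (e.g. by
-- Sqoop); dropping an external table leaves the underlying data intact.
CREATE EXTERNAL TABLE transactions_raw (
    txn_id     BIGINT,
    account_id STRING,
    amount     DECIMAL(18,2)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/raw/transactions';
```

Partition pruning on `load_dt` keeps daily queries from scanning the full history, which is typically the motivation for partitioning daily transaction feeds.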
Environment: Hadoop, MapReduce, HDFS, Hive, HBase, Java, Oracle, Cloudera Manager, Pig, Sqoop, Oozie, Tableau
Confidential, Salt Lake City, UT
ETL Developer/Lead
Responsibilities:
- Engaging with clients to understand their plans and objectives and assessing the team's ability to address those needs.
- Providing code/design analysis and strategy.
- Developing code per design specifications using knowledge of the relevant technology.
- Mentoring newer team members on the project to understand the system and deliver their assigned tasks.
- Reviewing test cases prepared for unit, system, regression, and end-to-end testing.
- Preparing test data for system and regression testing.
Environment: Ab Initio GDE, Co>Operating System 3.0, Oracle, Linux, Control-M, UNIX Shell Scripting
Confidential, Salt Lake City, UT
Application Lead (ETL & Mainframe)
Responsibilities:
- Leveraging subject matter expertise to work cohesively with business users, understand the technical problems they face, and provide round-the-clock production support for quicker resolution of issues within the platform and other interfacing systems, including third-party vendors where required.
- Providing design and implementation guidance to various technology partners.
- Developing code to enhance the application per new business requirements.
- Providing problem and incident management solutions to business users within agreed-upon target times and service-level agreements.
- Preparing low-level designs of the system along with the build strategy.
- Building various components across DB2 SQL, UNIX, and Ab Initio.
- Leading technical delivery and coordinating with other teams to ensure application and business goals are met.
Environment: Ab Initio GDE, Co>Operating System 3.0, DB2, Linux, Control-M, UNIX Shell Scripting, JCL, z/OS, Cognos Web reporting
Confidential
System Engineer (Mainframe & ETL)
Responsibilities:
- Providing 24x7 support to business users.
- Monitoring batch jobs.
- Resolving production job failures and batch abends.
- Communicating with front office and business users to resolve business queries.
- Investigating complex errors and production abends and making break-fix changes on the production server.
Environment: Ab Initio, COBOL, JCL, DB2, File-AID, TSO/ISPF, ChangeMan, SPUFI, NDM, IBM Mainframe utilities, Cognos Web reporting