Hadoop Consultant Resume
Austin, TX
SUMMARY:
- Seeking a new opportunity that will effectively challenge and utilize my problem-solving, analytical, and innovative capabilities across a broad range of projects.
- To be a resourceful, innovative, and results-oriented employee, achieving both corporate and personal growth through continuous development while enjoying job satisfaction.
WORK EXPERIENCE:
Confidential, Austin, TX
Hadoop Consultant
Responsibilities:
- Responsible for building scalable distributed data solutions using Hadoop
- Responsible for continuously monitoring the Hadoop cluster through Cloudera Manager
- Worked extensively with Sqoop for importing metadata from Oracle and migrating the ETL jobs into HDFS.
- Analyzed current data sources in depth to understand their structure.
- Identified which data needed to be pulled into HDFS.
- Ingested data into HDFS, then transformed and loaded it into target systems using Hive, MapReduce, and Sqoop.
- Accomplished transformations with MapReduce, Hive, and Pig
- Configured Sqoop and developed scripts to extract data from MySQL into HDFS.
- Created Hive queries for performing data analysis and improving performance using tuning parameters.
- Analyzed the data by performing Hive queries and running Pig scripts
- Responsible for Sqoop jobs that load data from Data Sources to HDFS
- Responsible for developing PIG Latin scripts.
- Responsible for moving transformed data into the data warehouse
- Migrated ETL processes from various data sources to Hive to validate easier data manipulation.
- Created HBase tables to store data in various formats coming from different sources.
- Trained and mentored analysts and the test team on the Hadoop framework, HDFS, MapReduce concepts, and the Hadoop ecosystem.
- Responsible for creating documentation.
- Helped IT infrastructure team to size the cluster.
Environment: Cloudera 4/5, MapR, Hive, HBase, Pig, Sqoop, Oozie, Java, Flume, Linux, Tableau, Oracle, SQL Server, MYSQL
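The Sqoop ingestion and Hive load steps described above can be sketched roughly as follows; host names, credentials, table names, and directory paths are illustrative assumptions, not actual project values, and the commands require a configured Hadoop cluster:

```shell
# Sketch only: connection string, table, and paths are hypothetical.
# Import a MySQL table into HDFS as the first step of the pipeline.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4

# Load the imported files into a Hive staging table for transformation.
hive -e "LOAD DATA INPATH '/data/raw/orders' INTO TABLE staging.orders;"
```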
Confidential
Hadoop Consultant
Responsibilities:
- Responsible for importing log data from external systems into HDFS
- Developed MapReduce jobs and Pig scripts
- Developed MapReduce programs to optimize 'writes' and parse data in HDFS obtained from various data sources.
- Managing and scheduling jobs using Oozie
- Responsible for designing and managing Sqoop jobs that load data from Oracle to HDFS
- Wrote Hive generic UDFs to perform business-logic operations at the record level
- Involved in business requirement gathering and analysis
- Responsible for writing Pig scripts and Hive queries
- Created Hive managed and external tables to store processed data
- Responsible for creating Hive tables, partitions, bucketing, loading data and writing hive queries
- Created Hive tables partitioned by date and processed the results into a tabular format.
- Created Hive internal/external tables with proper static and dynamic partitions.
- Analyzed unified historical data in HDFS using Hive to identify issues and behavioral patterns.
- Analyzed data and exported results to Tableau for visualization and reporting
Environment: Cloudera 4/5, MapR, Hive, HBase, Pig, Sqoop, Oozie, Java, Flume, Tableau
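The partitioned and bucketed Hive tables described above might be defined along the lines of this sketch; the database, table, column names, and paths are hypothetical, and the command assumes a working Hive installation:

```shell
# Sketch only: schema and paths are illustrative, not project values.
hive -e "
CREATE EXTERNAL TABLE logs.events (
  user_id STRING,
  action  STRING
)
PARTITIONED BY (event_date STRING)
CLUSTERED BY (user_id) INTO 16 BUCKETS
STORED AS ORC
LOCATION '/data/processed/events';

-- Dynamic-partition insert from a raw staging table.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT INTO TABLE logs.events PARTITION (event_date)
SELECT user_id, action, event_date FROM logs.events_raw;
"
```

With dynamic partitioning, Hive creates one partition per distinct event_date value at insert time instead of requiring a static partition clause for every load.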
Confidential
Information Technology Officer
Responsibilities:
- Handled all communication, documentation, and field data/information collated by Campaign Directors.
- Directed operations to meet the needs of Coordinators and Campaign Directors from all 17
- Coordinated all Information Technology Equipment & Operations in the Umuahia Office.
- Gave relevant advice and possible solutions to IT and operations-related issues.
- Worked with and outsourced to IT professionals to meet operations/campaign demands.
- Supported Coordinators and Field Workers in the Campaign office
Confidential
Engineering Data Specialist/ Maintenance Operations Assistant
Responsibilities:
- Collected and prepared seismic data acquired with various engineering equipment
- Prepared and sent such data to the Seismology department for analysis
- Maintained and operated the engineering equipment database
- Tracked and recorded electronic data for all day-to-day engineering work in a MySQL database
- Gained preliminary knowledge of the SERCEL 408UL and 428XL equipment for FDU troubleshooting
- Assisted Engineers in the preparation of monthly reports
- Kept count of equipment in the field using an Excel spreadsheet
- Assisted Engineers in the repair, maintenance, and installation of equipment, geophones, computers, radios, etc.
Confidential
Graduate Trainee
Responsibilities:
- Assisted with bulk money handling in the Operations Unit.
- Received, counted, and recorded bulk cash collected daily.
- Balanced accounts daily without recording any shortage or overage.
- Assisted in marketing Confidential e-banking and retail products to customers.
- Took part in selecting and loading money into the ATM.
- Assisted in resolving customer needs and complaints in the Customer Service Unit.
- Monitored operation of the ATM under the supervision of the Cash Officer.
Confidential
Industrial Trainee
Responsibilities:
- Systems Maintenance and Installation
- Computer Networking and maintenance for all departments
- Involved in resolving network problems in all departments
- Troubleshot all computer systems and peripherals
- Involved in the Programming Unit's activities