- 8+ years of experience in all phases of IT projects, including application design, development, production support, and maintenance.
- Good managerial skills; self-reliant and able to work independently with minimal supervision.
- Focused on results and on achieving targets.
- Good written communication skills, with the analytical capacity to synthesize project outputs and relevant findings into quality project reports.
- Strong organizational and time-management skills.
- Good hands-on experience with Hadoop and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
- Implemented Hadoop-based data warehouses and integrated Hadoop with enterprise data warehouse systems.
- Experience in Apache Hadoop MapReduce programming, Pig scripting, and distributed applications.
- Good understanding of NoSQL databases.
- Experience in importing and exporting data with Sqoop between HDFS and relational database systems.
- Good experience in analyzing data using HiveQL, Pig Latin, HBase, and custom MapReduce programs in Java.
- Good experience in web/intranet and client/server technologies using Java, J2EE, Servlets, JSP, JSF, EJB, JDBC, and SQL.
- Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology; good knowledge of J2EE and Core Java design patterns.
- Hands-on experience in application development using Java and RDBMS.
- Extensively worked on database applications using DB2, Oracle 11g/10g, and SQL.
Instructor & Big data Hadoop Developer
Confidential, Dallas, TX
- Work with students studying for a degree, certificate, or certification, or taking classes to improve their knowledge or career skills.
- Develop an instructional plan (course outline or syllabus) for each course taught and ensure that it meets institute and department standards.
- Plan lessons and assignments.
- Assess students’ progress by grading papers, tests, and other work.
- Advise students about which classes to take and how to achieve their goals.
- Hadoop development and implementation.
- Loading data from disparate data sets.
- Pre-processing using Hive and Pig.
- Translate complex functional and technical requirements into detailed design.
- Maintain security and data privacy.
- Managing and deploying HBase.
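The Hive/Pig pre-processing above amounts to filtering malformed records before analysis. A minimal plain-Java sketch of that kind of filter (the field layout and counts are hypothetical; in practice this would be expressed as a Pig FILTER or a Hive query):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Illustrative sketch: the kind of row filtering a Pig FILTER/FOREACH
// pre-processing step performs, expressed in plain Java. The record
// layout (3 comma-separated fields, id first) is a made-up example.
public class PreProcess {
    // Keep only rows with exactly 3 fields and a non-empty id.
    public static List<String> cleanRows(Stream<String> rows) {
        return rows
            .map(String::trim)
            .filter(r -> !r.isEmpty())
            .filter(r -> {
                String[] f = r.split(",", -1);
                return f.length == 3 && !f[0].isEmpty();
            })
            .collect(Collectors.toList());
    }
}
```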
Broadcast & Studio Engineer & Big data Hadoop Developer
- Maintain and operate all studio electronic equipment (lights, cameras).
- Install new and used equipment.
- Maintain and repair ENG equipment.
- Receive feeds from SNG vehicles.
- Maintain and repair satellite dishes and related equipment.
- Assist in the design of any improvements to facilities and operations.
- Maintain documentation of installation, repair and upgrades.
- Maintain and repair the sound system (audio mixer, speakers, microphones, earpieces, audio encoder connections, studio audio wiring).
- Work with equipment from multiple brands (Grass Valley, Ross, nivada, 360 Systems, FOR-A, ORAD-VDI, Morpho 3D, Reuters receivers, and signal distribution equipment).
- Receive signals from different sources through the ingest system and process them through the DeckLink card and Slingbox.
- Perform other duties as needed in the MCR (Master Control Room).
- Monitor and operate on-air graphics equipment.
- Troubleshoot and diagnose system and network problems.
- Install and configure computer hardware, operating systems, and applications.
- Classify and distribute around 500 GB of storage data every day.
- Desktop support and Windows 7 support.
- Involved in installing Hadoop Ecosystem components.
- Developed Scripts and Batch Job to schedule various Hadoop Programs.
- Worked on a Hadoop cluster of 8-10 nodes during the pre-production stage.
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Wrote custom MapReduce programs to analyze data and used Pig Latin to clean unwanted data.
- Used Sqoop to import the data from RDBMS to Hadoop Distributed File System (HDFS) and later analyzed the imported data using Hadoop Components.
- Created Hive tables and applied HiveQL queries to them, which automatically invoked and ran MapReduce jobs.
- Performed loading and transforming large sets of Structured, Semi-Structured and Unstructured data and analyzed them by running Hive queries and Pig scripts.
- Installed the Oozie workflow engine to run multiple Hive and Pig jobs, which execute independently based on time and data availability.
- Performed Hadoop installation, updates, patches, and version upgrades as required.
Environment: CDH3, CDH4, Pig 0.8.1, Hive 0.7.1, Sqoop v1, Java, Eclipse, Flume, ZooKeeper, Oozie, SQL Server 2008, HBase, Oracle 11g/10g.
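As a concrete illustration of how a HiveQL query becomes a MapReduce job: an aggregate such as `SELECT key, COUNT(*) FROM t GROUP BY key` compiles into map tasks that emit `(key, 1)` pairs and reduce tasks that sum them per key. A dependency-free plain-Java sketch of that computation (table and column names are hypothetical):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch (no Hadoop dependency): the emit-and-sum that a
// HiveQL "SELECT key, COUNT(*) FROM t GROUP BY key" performs when Hive
// runs it as a MapReduce job.
public class GroupByCount {
    public static Map<String, Long> count(List<String> keys) {
        Map<String, Long> out = new HashMap<>();
        for (String k : keys) {            // the "map" side emits (k, 1)
            out.merge(k, 1L, Long::sum);   // the "reduce" side sums per key
        }
        return out;
    }
}
```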
Audit & monitoring & Big data Hadoop Developer
- Full-time work on controlling the Refugee Assistance Information System (RAIS), which includes case management, assistance, group assistance, home visits, health assistance, referrals, reporting, and survey tools.
- Fieldwork experience: Oman Embassy 2012 cash grant to Iraqi and Syrian refugees in Jordan.
- Outreach Field monitoring visits.
- Monitoring and auditing the work of the data entry specialist team.
- Managing project filing and documentation, and keeping copies of documents.
- Survey and analysis.
- Data entry and keeping records updated.
- Working on periodic reports based on statistical information.
- Involved in ingesting data received from various providers into HDFS for big data operations.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data in various formats such as text, zip, XML, and JSON.
- Supported MapReduce programs running on the cluster.
- Installed and configured Pig and wrote Pig Latin scripts.
- Imported data using Sqoop to load data from Oracle to HDFS on a regular basis, or from the Oracle server to HBase, depending on requirements.
- Wrote Hive queries for data analysis to meet business requirements; created Hive tables and worked on them using HiveQL.
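Loading mixed text/zip/XML/JSON inputs, as above, usually begins by normalizing compressed files to plain bytes before format-specific parsing. A minimal plain-Java sketch using the JDK's gzip support (gzip stands in here for the compressed formats; this is an assumption for illustration, not the project's actual loader):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;

// Illustrative sketch: normalize a possibly-gzipped payload to plain bytes
// before text/XML/JSON parsing. Gzip data starts with the magic bytes
// 0x1f 0x8b; anything else is passed through unchanged.
public class Normalize {
    public static byte[] toPlainBytes(byte[] in) throws IOException {
        if (in.length > 2 && (in[0] & 0xff) == 0x1f && (in[1] & 0xff) == 0x8b) {
            try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(in))) {
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                gz.transferTo(out); // decompress fully into the buffer
                return out.toByteArray();
            }
        }
        return in; // already plain text/XML/JSON
    }
}
```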
Presentations provider & Data analyst
- Review and analyze market surveys undertaken by the various field teams on the cost of materials and labor for construction and rehabilitation activities and advise the Pricing Expert on the behavior of the market and the expected effect on the project budget.
- For any new bids, check the Bills of Quantities, Terms of Reference and work plans for the rehabilitation and construction of infrastructure, to ensure that they meet the needs/priorities of the clients.
- Supervise the preparation of technical specifications for goods and equipment and lead on procurement processes for new bids.
- Developing and enhancing applications built on the Java platform.
- Creating technical documentation (assets) in the code per coding standards and assisting as needed with documentation for the products' customers.
- Created Use case, Sequence diagrams, functional specifications and User Interface diagrams using Star UML.
- Involved in complete requirement analysis, design, coding and testing phases of the project.
- Participated in JAD meetings to gather the requirements and understand the End Users System.
- Generated XML Schemas and used XML Beans to parse XML files.
- Created Stored Procedures & Functions. Used JDBC to process SQL Server databases.
- Developed code to create XML files and flat files from data retrieved from databases and XML files.
- Created data sources and helper classes used by all interfaces to access and manipulate data.
- Developed the interfaces using Eclipse 3.1.1.
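The XML work above used XML Beans against generated schemas; as a dependency-free illustration of the same kind of parsing, here is a sketch with the JDK's built-in DOM parser (element names are hypothetical):

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

// Illustrative sketch: extract the text of the first element with the given
// tag from an XML payload, using the JDK's DOM parser instead of XMLBeans.
public class XmlRead {
    public static String firstText(String xml, String tag) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        return doc.getElementsByTagName(tag).item(0).getTextContent();
    }
}
```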