Subject Matter Expert And Technology Analyst Resume

New York, NY

SUMMARY:

  • 8 years of extensive hands-on experience in the IT industry, with expertise in Big Data technologies.
  • Worked as a Subject Matter Expert and Technology Analyst for the Royalties Department.
  • Excellent knowledge of Big Data infrastructure: the HDFS distributed file system, the MapReduce parallel-processing framework, and the complete Hadoop ecosystem (Hive, Pig, Sqoop, Oozie, and Flume).
  • Performed a wide range of ETL activities using Pig, Hive, and Sqoop scripts (see the sketch following this summary).
  • Knowledge of job/workflow scheduling and monitoring tools such as Oozie and ZooKeeper, and working experience with distributed data-processing systems.
  • Experience managing scalable Hadoop clusters, including cluster design, provisioning, custom configuration, monitoring, and maintenance across different Hadoop distributions: Hortonworks HDP, Apache Hadoop, and Hadoop on AWS.
  • Working knowledge of architecting Hadoop solutions, including hardware recommendations, network topology design, storage configuration, benchmarking, performance tuning, administration, and support.
  • Experience setting up monitoring and alerting systems using open-source tools such as Ganglia and Nagios.
  • Experience developing ETL processes using the MapReduce framework in Java.
  • Experience working with relational databases such as MySQL, MS SQL Server, and Oracle.
  • Expertise in NoSQL database technologies: Cassandra, HBase, and MongoDB.
  • Interacted with business users and the systems engineering team for requirements gathering and scoping, and assisted them in solving problems.
  • Played a vital role in project migration, using Data Services as an ETL tool to migrate data from the source system to the target system.
  • Proficient in the analysis, coding, testing, and implementation of COBOL programs.
  • Substantial development experience creating stored procedures and PL/SQL packages.
  • Experience and knowledge in designing and deploying UNIX shell scripts.
  • Strong judgment, analytical, communication, and documentation skills across all phases of the SDLC.
  • Collaborated with business users, product owners, and developers to contribute to the analysis of functional requirements.
  • Experience with and strong knowledge of Agile methodology best practices.
  • Experience in troubleshooting, root-cause analysis, debugging, and automating solutions for operational issues in the production environment.
  • Handled development, testing, and support activities, and was involved in project estimation and scheduling.
  • Mentored and trained project members to enable them to perform their activities effectively.
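
To illustrate the kind of Pig/Hive/Sqoop ETL step referenced in the summary above, the following is a minimal shell sketch; the database, tables, credentials, and HDFS paths are hypothetical placeholders rather than details of any specific project.

    #!/bin/bash
    # Hypothetical daily ETL step: import one day of orders from MySQL with Sqoop,
    # then aggregate it into a partitioned Hive summary table.
    RUN_DATE=2014-06-01

    # Import the source rows for the run date into a date-stamped HDFS directory.
    sqoop import \
      --connect jdbc:mysql://dbhost/sales \
      --username etl_user --password-file /user/etl/.dbpass \
      --table orders \
      --columns "order_id,product_id,amount" \
      --where "order_date = '${RUN_DATE}'" \
      --target-dir /data/raw/orders/${RUN_DATE} \
      --num-mappers 4

    # Register the imported files as a partition of an external Hive table, then aggregate them.
    hive -e "
      ALTER TABLE raw_orders ADD IF NOT EXISTS PARTITION (order_date='${RUN_DATE}')
        LOCATION '/data/raw/orders/${RUN_DATE}';
      INSERT OVERWRITE TABLE daily_sales PARTITION (order_date='${RUN_DATE}')
      SELECT product_id, SUM(amount) FROM raw_orders
      WHERE order_date='${RUN_DATE}' GROUP BY product_id;"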

TECHNICAL SKILLS:

Operating Systems: OS/390, Windows 2003, Windows XP, UNIX, Red Hat/CentOS Linux

Environment: Windows, Unix and Mainframe

Databases: Oracle (PL/SQL), DB2, Cassandra, MongoDB, HBase

ETL: SAP Data Services

Languages: ACUCOBOL, COBOL, JCL, SQL, Data Structures, UNIX Shell Script

Hadoop: HDFS, MapReduce, Hive, HBase, Impala, Hue, HCatalog, Pig, Sqoop, Flume, Oozie, Zookeeper

Hadoop Cloud: AWS, Rackspace, S3

Hadoop Monitoring & Management: Ganglia, Ambari, Nagios

Hadoop Distributions: Hortonworks HDP 1.3, Hortonworks HDP 2.0, AWS, Cloudera

PROFESSIONAL EXPERIENCE:

Confidential, NEW YORK, NY

Subject Matter Expert and Technology Analyst

Responsibilities:

  • Installed and administered a Hadoop cluster; debugged and troubleshot issues in the development and test environments.
  • Installed and configured various components of the Hadoop ecosystem and maintained their integrity.
  • Commissioned DataNodes as cluster capacity required and decommissioned them when hardware degraded or failed.
  • Collected metrics for Hadoop clusters using Ganglia and Ambari.
  • Designed, configured and managed the backup and disaster recovery for HDFS data.
  • Designed a workflow that schedules Hive processing of physical-sales file data, which is streamed into HDFS using Flume.
  • Designed and developed an ETL workflow in Oozie to meet business requirements, including automated extraction of data from a MySQL database into HDFS using Sqoop scripts (see the sketch after this list).
  • Developed MapReduce programs to extract and transform data sets, loading the results into a Cassandra database.
  • Performed data analytics in Hive and then exported the resulting metrics back to an Oracle database using Sqoop.
  • Performed systems monitoring, upgrades, performance tuning and backup and recovery.
  • Proactively involved in ongoing maintenance, support, and improvement of the Hadoop cluster.
  • Performed data migration across clusters using DistCp (see the sketch following the technologies line below).
  • Involved in implementing NameNode high availability and automatic failover using ZooKeeper services to eliminate the single point of failure.
  • Worked with the Rackspace consulting team to administer the server hardware and operating system.
  • Worked with big data developers, designers, and data scientists to troubleshoot MapReduce job failures and issues with Hive, Pig, and Flume.
  • Performed Linux server administration for the server hardware and operating system.
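
As a hedged illustration of the Oozie-driven extraction and the Hive-to-Oracle export described above, the commands below show one plausible shape of those steps; the Oozie host, connection strings, tables, and paths are assumed placeholders.

    # Submit the (assumed) coordinator that runs the scheduled MySQL-to-HDFS Sqoop import, then list recent runs.
    oozie job -oozie http://oozie-host:11000/oozie -config /home/etl/ingest/job.properties -run
    oozie jobs -oozie http://oozie-host:11000/oozie -jobtype coordinator -len 10

    # Export Hive analytics output (files under the warehouse directory) back to an Oracle metrics table.
    sqoop export \
      --connect jdbc:oracle:thin:@orahost:1521:ORCL \
      --username rpt_user --password-file /user/etl/.orapass \
      --table SALES_METRICS \
      --export-dir /apps/hive/warehouse/daily_sales_metrics \
      --input-fields-terminated-by '\001'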

Technologies: Hortonworks Data Platform (HDP 1.3 & HDP 2.0), RHEL 5.8 and 6.3, Hive, Pig Latin, Ambari, Sqoop, HBase, Cassandra, AWS, S3, and ParAccel
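
For the cross-cluster migration and NameNode high-availability items above, a minimal sketch with hypothetical NameNode addresses and service IDs:

    # Copy a dataset from the old cluster to the new one; -update skips files that already match.
    hadoop distcp -update -m 20 \
      hdfs://old-nn:8020/data/royalties \
      hdfs://new-nn:8020/data/royalties

    # One-time initialization of the ZooKeeper failover state, then confirm which NameNode is active.
    hdfs zkfc -formatZK
    hdfs haadmin -getServiceState nn1
    hdfs haadmin -getServiceState nn2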

Confidential

Data Analytics

Responsibilities:

  • Configured and built a multi-node cluster and installed Hadoop ecosystem software such as HDFS, MapReduce, HBase, Pig, Sqoop, Spark, and Hive (see the Ambari sketch after the technologies line below).
  • Assisted in the design, development, and architecture of Hadoop systems.
  • Used shell scripting to help automate database administration and monitoring tasks.
  • Prepared and bulk-loaded the daily incremental data collected from source systems into Cassandra column families (see the sketch after this list).
  • Validated the data through the Swagger API URL by providing the key for the corresponding column family.
  • Automated the entire workflow, from data preparation to the presentation layer, for the Confidential project.
  • Automated the entire lifecycle process, including the snapshot feature.
  • Installed and configured Hive, Pig, Sqoop, and Oozie on the HDP cluster.
  • Worked with big data developers, designers, and data scientists to troubleshoot MapReduce job failures and issues with Hive, Pig, and Flume.
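
The sketch below shows one plausible form of the Cassandra bulk load and API validation described above; the keyspace, column family, staging path, endpoint, and row key are hypothetical, and sstableloader is only one of several load mechanisms that could have been used.

    # Stream the day's prepared SSTables into the target column family on the live cluster.
    sstableloader -d cass-node1,cass-node2 /data/staging/sales_ks/daily_sales/

    # Spot-check a loaded row through the service's REST endpoint and pretty-print the response.
    curl -s "http://api-host:8080/v1/daily_sales/ROW_KEY_12345" | python -m json.tool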

Technologies: Hortonworks Data Platform (HDP 1.3 & HDP 2.0), Hive, Pig Latin, Ambari, Sqoop, Cassandra, and ParAccel.
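
As a rough sketch of standing up an Ambari-managed HDP cluster like the one described above (host names are assumed; the Hive, Pig, Sqoop, and Oozie services themselves are then added through the Ambari UI or REST API):

    # On the management node: silent setup and start of the Ambari server.
    ambari-server setup -s
    ambari-server start

    # On every cluster node: point the agent at the Ambari server host, then start it.
    sed -i 's/hostname=localhost/hostname=ambari-host/' /etc/ambari-agent/conf/ambari-agent.ini
    ambari-agent start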

Confidential

Responsibilities:

  • Preparation of Requirements Analysis, Impact Analysis and Test Results Documents.
  • Preparation of Traceability Matrix.
  • Review of Functional Specifications.
  • Coding/Fixing based on the impact of SAP migration.
  • Preparation of test cases and execution (Unit Test Plan and Unit Test Results).
  • Direct interaction with Business Manager and Business Team.
  • Attend peer reviews and internal inspections.
  • Mentored new joiners on the team.
  • Reviewed and tracked the work done by new joiners.

Technologies: ACUCOBOL, UNIX Shell Scripting, PL/SQL, Oracle 8i, Data Services.

Confidential

Subject Matter Expert and Technology Analyst

Responsibilities:

  • Preparation of Requirements Analysis, Impact Analysis and Test Results Documents.
  • Creation of process documents for ad hoc tasks from the business.
  • Preparation of Traceability Matrix.
  • Coding/fixing of production incidents and ad hoc tasks.
  • Preparation of test cases and execution (Unit Test Plan and Unit Test Results).
  • Direct interaction with Business Manager and Business Team.
  • Attend peer reviews and internal inspections.
  • Preparation of reconciliation of data processed and migrated during the migration projects.

Technologies: ACUCOBOL, UNIX Shell Scripting, PL/SQL, Oracle 8i
