
Big Data Analyst Resume

SUMMARY:

  • Over 14 years of extensive IT experience, including 7+ years managing all aspects of data science and Big Data: data design, data mining, hypothesis testing, statistical modeling, machine learning algorithms, and predictive analytics for consumer behavior, product development, and product pricing.
  • Experience in ensemble methods such as boosting, bagging, and random forests, along with linear and logistic regression, for building machine learning models (a short model-comparison sketch follows this summary).
  • Experience in funnel analysis using Markov chains with customer-level data to find the best paths (a Markov-chain sketch follows this summary).
  • Experience in TensorFlow and neural networks (CNN, RNN); a minimal CNN sketch follows this summary.
  • Experienced in deploying enterprise data management solutions and consolidating multiple data sources into a unified interface for a management decision support system consisting of a dashboard and analytical reporting engine.
  • Experience in designing and implementing NoSQL database stores such as MongoDB, HBase, and Cassandra.
  • Experience in analyzing data using HiveQL, Pig Latin, and custom Confidential programs.
  • Experience in importing and exporting data using Sqoop between HDFS and Relational Database Systems / mainframe.
  • Experience in the development, implementation, and testing of database projects.
  • Background with traditional databases such as Oracle, DB2, and SQL Server, as well as ETL tools/processes and Data Warehousing architectures.
  • Strong knowledge of all phases of the Software Development Life Cycle (SDLC).
  • Commended for technical, analytical, and problem-solving skills; effective task prioritization across various engineering disciplines to troubleshoot complex system-level issues.
  • Strong written, oral, interpersonal, and presentation communication skills.
  • Ability to perform at a high level, meet deadlines, and adapt to ever-changing environments.
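
The following is a minimal, illustrative sketch (not drawn from any specific engagement above) comparing bagging, boosting, and logistic regression with scikit-learn; the synthetic dataset and hyperparameters are placeholders.

    # Illustrative only: synthetic data and placeholder hyperparameters.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    models = {
        "random_forest (bagging)": RandomForestClassifier(n_estimators=200, random_state=0),
        "gradient_boosting": GradientBoostingClassifier(random_state=0),
        "logistic_regression": LogisticRegression(max_iter=1000),
    }
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print(f"{name}: mean AUC = {scores.mean():.3f}")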
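
A sketch of the Markov-chain funnel idea, assuming hypothetical stage names and customer paths: estimate first-order transition counts from customer-level paths, then report the most probable next stage from each state.

    # Stage names and paths are hypothetical placeholders.
    from collections import Counter, defaultdict

    paths = [
        ["visit", "view_product", "add_to_cart", "purchase"],
        ["visit", "view_product", "drop"],
        ["visit", "search", "view_product", "add_to_cart", "drop"],
    ]

    # Count stage-to-stage transitions across all customer paths.
    counts = defaultdict(Counter)
    for path in paths:
        for src, dst in zip(path, path[1:]):
            counts[src][dst] += 1

    # Normalize to probabilities and report the most likely next stage.
    for src, nexts in counts.items():
        total = sum(nexts.values())
        best, n = nexts.most_common(1)[0]
        print(f"{src} -> {best} ({n / total:.2f})")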
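
A minimal TensorFlow/Keras CNN sketch; the input shape, layer widths, and class count are arbitrary placeholders rather than a model from the work described above.

    # Illustrative only: shapes and layer sizes are placeholders.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(28, 28, 1)),  # e.g. 28x28 grayscale images
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()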

TECHNICAL SKILLS:

Data/Statistical packages: R, Python, Spark, SparkR, Pandas, Scikit, TensorFlow, Machine Learning, MapReduce

Big Data Ecosystem: HDFS, HBase, Hadoop Confidential, Ambari, Zookeeper, Hive, Pig, Sqoop, Flume, YARN, Oozie, Tableau, Storm, Spark, Mahout, Scala, JUnit, Cassandra, Impala, Kafka, Hue, JSON, XML, DevOps, Chef, Jenkins, Git, Knife, Logstash, Kibana, Vagrant, Ruby.

Languages: C, C++, Java, J2EE, COBOL, JCL, CICS, SQL/PLSQL, Ruby

Methodologies: Agile, V-model, and Waterfall model.

Database: Oracle 10g, PL/SQL, DB2, MySQL, Cassandra, MongoDB, HBase, IMS DB, VSAM, MS SQL Server, Amazon EC2

Web Tools: HTML, JavaScript, XML

Java Technologies: C, C++, Java, ODBC, Java Beans, EJB, MVC, Ajax, JSP, Servlets, Java Mail, Struts

IDE / Testing Tools: Eclipse.

Operating System: Windows, UNIX, Linux, MVS, IBM z/OS.

Scripts: Shell Scripting, Python Scripting, JavaScript.

Distribution: Cloudera and Hortonworks

PROFESSIONAL EXPERIENCE:

Confidential

Big Data Analyst

Responsibilities:

  • Deployed enterprise data management solutions and consolidated multiple data sources into a unified interface for a management decision support system consisting of a dashboard and analytical reporting engine.
  • Designed and implemented NoSQL data stores such as MongoDB, HBase, and Cassandra.
  • Analyzed data using HiveQL, Pig Latin, and custom Confidential programs (a query sketch follows this list).
  • Imported and exported data using Sqoop between HDFS and relational database systems / mainframe (a Sqoop sketch follows this list).
  • Developed, implemented, and tested database projects.
  • Worked with traditional databases such as Oracle, DB2, and SQL Server, along with ETL tools/processes and data warehousing architectures.
  • Applied knowledge of all phases of the Software Development Life Cycle (SDLC).
  • Commended for technical, analytical, and problem-solving skills; prioritized tasks across engineering disciplines to troubleshoot complex system-level issues.
  • Demonstrated strong written, oral, interpersonal, and presentation communication skills.
  • Performed at a high level, met deadlines, and adapted to ever-changing environments.
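
A sketch of a HiveQL analysis step run from Python via PyHive; the host, database, table, and column names are hypothetical placeholders, not details from this engagement.

    # Hypothetical host, database, table, and column names.
    from pyhive import hive

    conn = hive.Connection(host="hive-server.example.com", port=10000,
                           database="default")
    cursor = conn.cursor()
    cursor.execute("""
        SELECT country, COUNT(*) AS visits
        FROM web_logs
        WHERE event_date >= '2020-01-01'
        GROUP BY country
        ORDER BY visits DESC
        LIMIT 10
    """)
    for country, visits in cursor.fetchall():
        print(country, visits)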
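
A sketch of a typical Sqoop import invocation, wrapped in Python for consistency with the other sketches (it is normally run straight from the shell); the JDBC URL, credentials, table, and target directory are hypothetical placeholders.

    # Hypothetical connection details; assumes the sqoop CLI is on PATH.
    import subprocess

    subprocess.run([
        "sqoop", "import",
        "--connect", "jdbc:mysql://db.example.com/sales",
        "--username", "etl_user", "-P",        # -P prompts for the password
        "--table", "orders",
        "--target-dir", "/data/sales/orders",
        "--num-mappers", "4",
    ], check=True)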
