
Data Analyst Resume


Southfield, MI

SUMMARY

  • 9+ years of IT experience in the Automobile, Banking and Insurance domains, including 3+ years of hands-on experience in Big Data technologies and 5+ years in Mainframes
  • Experience in designing, reviewing, implementing and optimizing data transformation processes in the Hadoop ecosystem
  • Key participant in all phases of software development life cycle
  • Good experience working with the Spark framework and the Scala language
  • Excellent understanding of Hadoop architecture and core components
  • Expertise in writing Hadoop jobs for analyzing data using HiveQL (queries), Scala, Spark Streaming, Sqoop (data ingestion), Alteryx and shell scripts
  • Created dashboards using QlikView and Tableau
  • Experienced in working with the Hortonworks and Cloudera distributions
  • Experience in developing projects using the Waterfall model as well as Agile methodology, including performing the Scrum Master role
  • An excellent team player with good communication, analytical and problem-solving skills

PROFESSIONAL EXPERIENCE

Confidential, Southfield, MI

Data Analyst

Responsibilities:

  • Identified and enhanced new and existing data sources (third-party data, warranty, safety and manufacturing) to optimize business opportunities
  • Migrated Confidential data from SAS platform to Hadoop and made it compatible in order to get safety insights
  • Extracted, transformed, integrated and cleansed data from Hadoop databases to measure KPIs and generate business reports using Alteryx, Scala and HQL
  • Developed UDFs in Scala and Java
  • Created data visualizations in QlikView by defining data requirements and gathering and validating information in Alteryx, Scala and HQL
  • Developed Scala programs using DataFrames and RDDs in Spark for data aggregation and queries, writing results back into Hadoop
  • Extracted and parsed JSON and XML data from Kafka topics using Spark Streaming
  • Worked with Hive data warehouse tool to create tables by implementing partitioning and bucketing
  • Optimized Hive queries and Scala jobs
  • Scheduled and monitored job workflows using shell scripts and Oozie
  • Managed and reviewed Hadoop log files
  • Basic knowledge of working with Alteryx
  • Evaluated data accuracy, completeness, timeliness and established data enhancement, maintenance and security processes
  • Identified sensitive data and implemented GDPR compliance measures for it
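
The Spark aggregation work described in the bullets above can be sketched roughly as follows; this is a minimal illustration, and the Hive table, column names and HDFS path are hypothetical:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object WarrantyKpiJob {
  def main(args: Array[String]): Unit = {
    // Spark session with Hive support, as it would run on a Hadoop cluster
    val spark = SparkSession.builder()
      .appName("WarrantyKpiJob")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical Hive table of warranty claims
    val claims = spark.table("warranty.claims")

    // Aggregate claim counts and total claim cost per model and month
    val kpi = claims
      .groupBy(col("model"),
               date_format(col("claim_date"), "yyyy-MM").as("month"))
      .agg(count("*").as("claim_count"),
           sum("claim_cost").as("total_cost"))

    // Write the KPI results back into HDFS as partitioned Parquet
    kpi.write
      .mode("overwrite")
      .partitionBy("month")
      .parquet("/data/kpi/warranty") // hypothetical HDFS path

    spark.stop()
  }
}
```

A job like this would typically be packaged as a JAR and submitted with `spark-submit`, then scheduled from Oozie or a shell wrapper.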

Confidential

Senior Software Engineer

Responsibilities:

  • Designed, developed and maintained data integration programs in a Hadoop and RDBMS (DB2) environment for data access and analysis
  • Implemented business logic in Hadoop using UDFs
  • Imported and exported data between HDFS and Relational Database Management Systems using Sqoop
  • Worked with multiple operating systems: Windows variants, UNIX and Linux
  • Experienced with backend databases such as DB2
  • Developed projects using Agile methodology
  • Acted as Scrum Master for a team of 11 members
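
A Hadoop-side UDF of the kind mentioned above might look like the following sketch; the cleansing rule (VIN normalisation) and the column name are invented purely for illustration:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

object VinCleanseUdf {
  // Hypothetical business rule: normalise a VIN by upper-casing it and
  // stripping characters that are not valid in a VIN (no I, O or Q)
  val cleanseVin: String => String = raw =>
    Option(raw)
      .map(_.toUpperCase.replaceAll("[^A-HJ-NPR-Z0-9]", ""))
      .orNull

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("VinCleanseUdf").getOrCreate()
    import spark.implicits._

    // Register the function for both the DataFrame API and SQL queries
    val cleanseVinUdf = udf(cleanseVin)
    spark.udf.register("cleanse_vin", cleanseVin)

    // Tiny sample frame standing in for a real vehicle table
    val vehicles = Seq("1hgcm82633a004352 ", "wvwzzz-1jz3w38699").toDF("vin")
    vehicles.select(cleanseVinUdf($"vin").as("vin")).show()

    spark.stop()
  }
}
```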

Confidential

Associate IT Consultant

Responsibilities:

  • Involved in design, development, unit testing, review and implementation of Hadoop and Mainframe Projects
  • Performed requirement analysis and system testing
  • Worked at client location in Copenhagen, Denmark for 2 months
  • Imported/Exported the data using Sqoop
  • Developed Pig Latin scripts for Data transformations
  • Created Hive external tables, loaded data into them and queried the data using HQL
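
The Hive external-table pattern in the last bullet can be sketched as HQL issued through a Spark session; the database, table, columns and HDFS location are all hypothetical:

```scala
import org.apache.spark.sql.SparkSession

object ClaimsExternalTable {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ClaimsExternalTable")
      .enableHiveSupport()
      .getOrCreate()

    // External table over files already landed in HDFS (e.g. by Sqoop);
    // dropping an EXTERNAL table leaves the underlying data in place
    spark.sql("""
      CREATE EXTERNAL TABLE IF NOT EXISTS staging.claims (
        claim_id   STRING,
        model      STRING,
        claim_cost DOUBLE
      )
      PARTITIONED BY (claim_month STRING)
      STORED AS PARQUET
      LOCATION '/data/staging/claims'
    """)

    // Make newly landed partition directories visible to the metastore
    spark.sql("MSCK REPAIR TABLE staging.claims")

    // Query via HQL; the partition predicate prunes the files read
    spark.sql("""
      SELECT model, count(*) AS claims
      FROM staging.claims
      WHERE claim_month = '2016-04'
      GROUP BY model
    """).show()

    spark.stop()
  }
}
```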

Confidential

Senior Software Engineer

Responsibilities:

  • Worked for various Insurance and Banking Clients like Confidential
  • Involved in design, development, unit testing, review and implementation of Mainframe Projects
