Big Data Developer Resume

CA

SUMMARY

  • 11 years of professional experience in IT, including 3 years of extensive experience in the Hadoop ecosystem.
  • Worked on Big Data/Hadoop for the last 3 years, with experience across all stages of the SDLC.
  • Built business solutions on the Hadoop platform for machine log, device log, and banking transaction log data.
  • Proposed predictive analytics and machine learning solutions for failure prediction, customer behavior analytics, and machine learning algorithm implementation.
  • Architected next-generation EDW architecture for banking and e-commerce customers using Hadoop, Hive, and NoSQL.
  • Extensively worked on click-through rate prediction, user behavior inference, sentiment classification, and user recommendation.
  • Worked on various distributions including Cloudera, Hortonworks, and Apache.
  • Designed and developed a comprehensive ETL framework using Hadoop and Hive.
  • Delivered multiple deep-dive technology and sales presentations for customers on these platforms.
  • Hands-on experience with HDFS, Hadoop 1.2, MapReduce, Hive, Sqoop, Flume, Pig, Pentaho, HBase, MongoDB, R, Cloudera, and Cassandra, among others.
  • Hands-on experience with the reporting tool Pentaho and the statistical language R.
  • Worked with cloud services such as Amazon Web Services (AWS) and Google Cloud.
  • Experience in implementing the complete BI cycle involving SSIS and SSRS.
  • Experience in writing simple/complex queries, T-SQL, views, stored procedures, functions, triggers, cursors, indexes, constraints, etc.
  • Experience in creating simple/complex ETL packages (SSIS) by implementing various transformations as per business user requirements.
  • Experience in developing packages and loading data into the data warehouse using SSIS containers, tasks, and transformations. Used tasks such as Execute SQL task, Data flow task, Send mail task, File system task, and Script task, and transformations such as Derived column, Conditional split, Row count, Lookup, Data conversion, and Union all.
  • Experience in creating various types of reports using the SQL Server Reporting Services (SSRS) tool.
  • Experience in performing database backups and restores, creating and scheduling jobs, troubleshooting errors raised by jobs, and granting permissions to end users as per request.

PROFESSIONAL EXPERIENCE

Confidential, CA

Big Data Developer

Responsibilities:

  • Worked with architecture/engineering leads and other teams on capacity planning.
  • Loaded data from SQL Server into Hive using Sqoop.
  • Pulled daily data from websites into the Hadoop cluster using Flume.
  • Wrote MapReduce code to convert unstructured data into semi-structured data and loaded it into Hive tables (a mapper sketch follows this list).
  • Wrote fuzzy-matching/lookup logic to match addresses between the two files and implemented it as Hive UDFs (see the UDF sketch after this list).
  • Created complex Hive tables and executed complex Hive queries against the Hive warehouse.
  • Created components such as Hive UDFs to fill functionality gaps in Hive for analytics.
  • Installed and configured a Hadoop cluster along with Hive.
  • Developed optimized Pig scripts, which reduced the execution time of traditional batch jobs by 80%.
  • Managed and scheduled jobs on a Hadoop cluster using Oozie.
  • Good understanding of Avro and JSON.
  • Generated business reports using Tableau.
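
A minimal sketch of the kind of map-only MapReduce job used for the unstructured-to-semi-structured conversion mentioned above; the log layout, field order, and regex are illustrative assumptions, not the actual production format.

```java
import java.io.IOException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

/**
 * Map-only job: parses raw, unstructured log lines and emits
 * tab-delimited records matching a Hive table's field layout.
 * The regex below is a placeholder for the real log format.
 */
public class LogToHiveMapper
        extends Mapper<LongWritable, Text, NullWritable, Text> {

    // Hypothetical layout: "<timestamp> <device-id> <event> <payload>"
    private static final Pattern LOG_LINE =
            Pattern.compile("^(\\S+)\\s+(\\S+)\\s+(\\S+)\\s+(.*)$");

    private final Text out = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        Matcher m = LOG_LINE.matcher(value.toString());
        if (!m.matches()) {
            context.getCounter("logs", "malformed").increment(1);
            return;                      // skip lines that do not parse
        }
        // Tab-delimited output so the target Hive table can use
        // ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'.
        out.set(m.group(1) + "\t" + m.group(2) + "\t"
                + m.group(3) + "\t" + m.group(4));
        context.write(NullWritable.get(), out);
    }
}
```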
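
The following is a minimal sketch of a fuzzy address-matching Hive UDF of the kind referenced above, using a normalized Levenshtein similarity; the function name, scoring approach, and normalization rules are assumptions for illustration.

```java
import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

/**
 * Hive UDF returning a 0.0-1.0 similarity score between two address
 * strings, based on Levenshtein edit distance. Name and scoring
 * scheme are illustrative assumptions.
 */
@Description(name = "addr_similarity",
             value = "_FUNC_(a, b) - fuzzy similarity between two addresses")
public class AddressSimilarityUDF extends UDF {

    public Double evaluate(Text a, Text b) {
        if (a == null || b == null) {
            return null;
        }
        String s = normalize(a.toString());
        String t = normalize(b.toString());
        int maxLen = Math.max(s.length(), t.length());
        if (maxLen == 0) {
            return 1.0;
        }
        return 1.0 - (double) levenshtein(s, t) / maxLen;
    }

    // Lower-case and collapse whitespace so formatting differences
    // ("123  Main St" vs "123 main st") do not count as edits.
    private static String normalize(String s) {
        return s.toLowerCase().trim().replaceAll("\\s+", " ");
    }

    // Standard two-row Levenshtein edit distance.
    private static int levenshtein(String s, String t) {
        int[] prev = new int[t.length() + 1];
        int[] curr = new int[t.length() + 1];
        for (int j = 0; j <= t.length(); j++) {
            prev[j] = j;
        }
        for (int i = 1; i <= s.length(); i++) {
            curr[0] = i;
            for (int j = 1; j <= t.length(); j++) {
                int cost = s.charAt(i - 1) == t.charAt(j - 1) ? 0 : 1;
                curr[j] = Math.min(Math.min(curr[j - 1] + 1, prev[j] + 1),
                                   prev[j - 1] + cost);
            }
            int[] tmp = prev;
            prev = curr;
            curr = tmp;
        }
        return prev[t.length()];
    }
}
```

Once packaged, a UDF like this would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before use in queries.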

Environment: Java, Hadoop (Cloudera and Apache), Flume, Pig, Hive, Tableau, Spark, Splunk, Pentaho and Linux.

Confidential, Bentonville, Arkansas.

Big Data Developer.

Responsibilities:

  • Worked with architecture/engineering leads and other teams on capacity planning.
  • Developed generic scripts for loading data from the UNIX file system into HDFS (see the HDFS loader sketch after this list).
  • Used Flume NG to integrate signals data from different regions and load it into the Hadoop cluster.
  • Converted the Flume data to a structured format using Pig scripts and MapReduce programs.
  • Applied machine learning algorithms for clustering and classification.
  • Loaded data into Hive tables by implementing blocking and matching algorithms.
  • Managed and scheduled jobs on a Hadoop cluster using Oozie.
  • Used Hive-HBase integration for better storage and performance (see the Hive-on-HBase sketch after this list).
  • Built data visualization reports using Tableau/QlikView.
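
The UNIX-to-HDFS loading referenced above is often a shell wrapper around hadoop fs -put; a Java equivalent using the HDFS FileSystem API might look roughly like the sketch below, where the NameNode URI and paths are placeholders.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/**
 * Copies a local (UNIX file system) directory into HDFS, the Java
 * equivalent of "hadoop fs -put". NameNode URI and paths below
 * are placeholders.
 */
public class LocalToHdfsLoader {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020");   // assumed NameNode

        try (FileSystem fs = FileSystem.get(conf)) {
            Path src = new Path("/data/incoming/signals");   // local staging dir
            Path dst = new Path("/user/etl/raw/signals");     // HDFS landing dir

            fs.mkdirs(dst);
            // delSrc = false, overwrite = true
            fs.copyFromLocalFile(false, true, src, dst);
            System.out.println("Loaded " + src + " -> " + dst);
        }
    }
}
```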
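
The Hive-HBase integration mentioned above typically comes down to a Hive table backed by the HBaseStorageHandler; below is a hedged sketch of creating such a table through HiveServer2 JDBC, where the host, table name, and column mapping are illustrative assumptions and the underlying HBase table is assumed to already exist.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

/**
 * Creates a Hive table backed by an HBase table via the
 * HBaseStorageHandler, executed through HiveServer2 JDBC.
 * Host, credentials, table, and column names are placeholders.
 */
public class HiveHBaseTableSetup {

    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");   // hive-jdbc on classpath
        String url = "jdbc:hive2://hiveserver2:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "etl", "");
             Statement stmt = conn.createStatement()) {
            // External table mapped onto the pre-existing HBase table "signals".
            stmt.execute(
                "CREATE EXTERNAL TABLE IF NOT EXISTS signals_hbase ("
                + "  rowkey STRING, region STRING, reading DOUBLE) "
                + "STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' "
                + "WITH SERDEPROPERTIES ("
                + "  'hbase.columns.mapping' = ':key,d:region,d:reading') "
                + "TBLPROPERTIES ('hbase.table.name' = 'signals')");
        }
    }
}
```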

Environment: Java, Cloudera Hadoop, Flume, Sqoop, Pig, Impala, Hive, HBase, Cassandra, Tableau, and Linux.

Confidential, San Francisco, CA

Hadoop Developer

Responsibilities:

  • Created Hive external and managed tables and designed data models in Hive.
  • Implemented business logic using Pig scripts.
  • Exported the analyzed data to relational databases using Sqoop for visualization and for generating reports for the BI team (see the Sqoop export sketch after this list).
  • Analyzed large data sets in Hive and integrated Hive with Tableau.
  • Performed performance tuning of Hive queries written by other developers.
  • Managed and scheduled jobs on a Hadoop cluster using Oozie (see the Oozie submission sketch after this list).
  • Developed generic scripts for loading data from the UNIX file system into HDFS.
  • Participated in the requirements gathering and analysis phase of the project, documenting business requirements by conducting workshops/meetings with various business users.
  • Deep and thorough understanding of ETL tools and how they can be applied in a Big Data environment.
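
A minimal sketch of the Sqoop export step referenced above, invoked through Sqoop 1.x's Java entry point rather than the CLI; the JDBC URL, credentials file, table, and export directory are placeholders, not actual project values.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.sqoop.Sqoop;

/**
 * Exports an analyzed Hive/HDFS result set back to a relational
 * database with Sqoop 1.x. Connection string, table, and paths
 * are placeholders.
 */
public class HdfsToRdbmsExport {

    public static void main(String[] args) {
        String[] sqoopArgs = {
            "export",
            "--connect", "jdbc:sqlserver://dbhost:1433;databaseName=reports",
            "--username", "etl_user",
            "--password-file", "/user/etl/.db_password",
            "--table", "daily_metrics",
            "--export-dir", "/user/hive/warehouse/daily_metrics",
            "--input-fields-terminated-by", "\t"
        };
        // Delegates to the same tool the "sqoop export" command runs.
        int rc = Sqoop.runTool(sqoopArgs, new Configuration());
        System.exit(rc);
    }
}
```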
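
Oozie job management, as referenced above, is normally driven by workflow and coordinator XML definitions; the sketch below shows one hedged way to submit such a workflow from Java with the Oozie client API, where the server URL, application path, and property values are assumptions.

```java
import java.util.Properties;

import org.apache.oozie.client.OozieClient;
import org.apache.oozie.client.WorkflowJob;

/**
 * Submits a workflow to the Oozie server from Java; in practice the
 * same job is usually triggered on a schedule by an Oozie coordinator.
 * URL, application path, and property values are placeholders.
 */
public class OozieJobSubmitter {

    public static void main(String[] args) throws Exception {
        OozieClient client = new OozieClient("http://oozie-host:11000/oozie");

        Properties conf = client.createConfiguration();
        conf.setProperty(OozieClient.APP_PATH,
                "hdfs://namenode:8020/user/etl/workflows/daily-load");
        conf.setProperty("nameNode", "hdfs://namenode:8020");
        conf.setProperty("jobTracker", "jobtracker:8021");
        conf.setProperty("queueName", "default");

        String jobId = client.run(conf);          // submit and start
        WorkflowJob job = client.getJobInfo(jobId);
        System.out.println(jobId + " -> " + job.getStatus());
    }
}
```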

Environment: Java, Cloudera, Sqoop, Pig, Hive, SQL Server, Impala, Tableau, and Linux.

Confidential

MSBI Developer

Responsibilities:

  • Understood the business and its requirements.
  • Involved in designing, coding, and testing of SSIS packages and SSRS reports.
  • Extensively worked with various transformations to implement business requirements in the SSIS packages.
  • Involved in creating package configurations, checkpoints, etc.
  • Hands-on debugging, error handling, and deployment of SSIS packages.
  • Hands-on security implementation in SSIS packages.
  • Involved in performance tuning of stored procedures.
  • Hands-on writing of complex T-SQL, views, stored procedures, etc.
  • Involved in creating, deploying, and scheduling reports in different formats as per user requirements using SSRS.

Confidential

MSBI Developer

Responsibilities:

  • Developed customized SSIS transformations.
  • Involved in understanding the BRD (Business Requirements Document), which included the database mapping document and the look and feel of the reports.
  • Designed and developed database scripts and procedures.
  • Developed packages for the ETL process (first run and incremental loads).
  • Involved in creating the package and responsible for moving it from the development server to the production server.
  • Developed job log reports using SSRS.
  • Participated in discussions with the client regarding changes or modifications to reports.
  • Created reports by defining prompts and filters as per requirements.
  • Responsible for managing report subscriptions.

Environment: SQL Server 2005 (SSIS, SSRS).
