
Sr Big Data/Hadoop Tech Lead Resume


NJ

SUMMARY:

  • A versatile and skilled IT professional specializing in AWS and Big Data/Hadoop, with over 12 years and 1 month of experience.
  • 3 years and 5 months of experience in Big Data and Hadoop: HDFS, MapReduce, Spark, Hive, Sqoop, Hue and Flume.
  • Good experience in Amazon Web Services (AWS) - Compute (EC2 and Lambda), Networking & Content Delivery (VPC), Storage (S3 and Glacier), Database (RDS and Redshift), Security, Identity & Compliance (IAM), Analytics (EMR, Athena and Data Pipeline), Messaging (SNS and SQS), Application Services (SWF) and Artificial Intelligence (Alexa/Lex/Polly).
  • Sound experience in OpenStack services and their CLIs and REST APIs (Keystone, Nova, Neutron, Glance and Swift).
  • Good experience in Core Java, including developing REST APIs.
  • Extensive experience in Software Analysis, Design, Coding, Unit & Assembly Testing, Execution and Delivery of complex projects using various cloud and core technologies.
  • Very good experience in languages such as C, C++ and Core Java.
  • Good experience in handling databases like Oracle, PSQL and MySQL.
  • Good experience in Shell scripting and PHP scripting.
  • Good experience in team & product management, estimation, planning and design/code reviews.
  • Good experience in Agile (Scrum) process.
  • Ability to meet deadlines and handle multiple tasks; flexible in work schedules, with good communication skills; effective working independently and collaboratively in teams.
  • Good working knowledge of design tools such as Microsoft Visio.
  • Functional domain experience spans Logistics and Transportation, Middleware, Networking, Enterprise Security, Aerospace, Cloud Computing and Healthcare systems.
  • Good experience interacting with business analysts, CRM and testing teams. Experience in team leading, client interaction and status reporting.
  • Expertise in handling change management and incident management requests to implement the requested changes in QA/production.

TECHNICAL SKILLS:

Operating Systems: Windows, Linux, Unix and NSK

Programming Languages: C, C++, VC++, Core Java, PHP, Python, Scala

Scripting Languages: Unix Shell, PHP, Python

Big Data Technologies: Hadoop, HDFS, MapReduce, Hive, Sqoop, Spark, Hue, Flume

AWS Services: EC2, Lambda, VPC, S3, Redshift, RDS, IAM, EMR, Data Pipeline, SNS, SQS, SWF, ALEXA/LEX, Glacier, Athena, Polly

RDBMS: Sybase 9.0/10.0, Oracle 8i/9i, PSQL, MySQL

Multithreading Technologies: RPC, IPC, POSIX Multithreading

REST APIs: DoubleClick, Google Analytics and Eloqua

Version Control and Defect Tracking Tools: Microsoft Visual SourceSafe, Perforce, ClearCase, StarTeam, Stash/GitHub (SourceTree), Mercury Quality Center, JIRA

Other Development Tools and Technologies: Eclipse, Microsoft Visual Studio 6.0/8.0, GDB, DBX, eGarth/eInspect, GenWeb, PSPad, Source Navigator, Beyond Compare, WinMerge, Microsoft Visio, Crystal Report, CloudBerry, WinSCP, SQL/MySQL Workbench, Rational Purify

PROFESSIONAL EXPERIENCE:

Confidential, NJ

Sr Big Data/Hadoop Tech Lead

Responsibilities:

  • Interact with business/SMEs to support data collection from different datasets/data sources. Involved in the design and development of the data collection using Amazon Web Services S3.
  • Design and build MapReduce programs in Hadoop for transforming the collected data stored in Amazon S3.
  • Design and develop AWS Elastic MapReduce (EMR) clusters to run the Hadoop programs.
  • Develop the rules for the different datasets/data sources required for the transformation.
  • Design and develop the program for loading the transformed data into the Amazon Redshift database for the different datasets/data sources.
  • Build the pipeline for the collection, transformation and loader processes.
  • Design and develop the data analytics reports using the Spark framework.
  • Design the data orchestration using the Amazon Simple Workflow (SWF) component.
  • Execution of Technical Unit Testing (TUT) and recording of results.
  • QA, compliance validation, sign-off, and cut-over planning and execution.
  • Assignment of tasks among the team members.
  • Drafting initial charter, project plan, coordinating efforts for completing activities in plan.
  • Providing regular status reports for all activities related to the project.
  • Identifying and resolving issues & mitigating risks.

Environment: Amazon Web Services (AWS) Cloud, Java, Python, PHP, Big Data, Hadoop (MapReduce, HDFS), Sqoop, Hive, Spark, Hue, Amazon Web Services (EC2, Lambda, S3, EMR, SQS, SNS, SWF, RDS, Redshift, Alexa/Lex, etc.), Talend, Eclipse, SQL/MySQL Workbench, PuTTY, CloudBerry, WinSCP, Unix Shell Scripting, Google Analytics, Google DoubleClick, Eloqua REST APIs.

Confidential

Sr Big Data/Hadoop Tech Lead

Responsibilities:

  • Requirements analysis, design, coding and unit testing.
  • Analysis of plugins for different kinds of logs for Logstash and Fluentd.
  • Analysis of Logstash filters.
  • Writing use cases for different types of logs.
  • Writing custom plugins.
  • Evaluation of the performance of Logstash and Fluentd for real-time log analysis.
  • Verification of the workaround for the role-based view capability in Kibana.

Environment: OpenStack, Amazon Web Services (AWS), Kibana, Logstash, Elasticsearch, HDFS, Hive, Java, Linux, PuTTY, Eclipse, CloudBerry, WinSCP, Unix Shell Scripting.

Confidential

Tech Lead

Responsibilities:

  • Requirements analysis, design, coding and unit testing.
  • Analysis of Spatial Index and LP solver libraries.
  • Parsing and traversal of the power-line network and elevation points.
  • Creation of spatial index data model and indexing using spatial index.
  • Solving the algorithm constraints using LP solver.
  • Correlating the terrain data with the network data.

Environment: AWS, Linux, C++, Data Structures, Python Scripting, GNU Make, PuTTY, Eclipse, CloudBerry, WinSCP, Unix Shell Scripting.

Confidential

Tech Lead

Responsibilities:

  • Requirement analysis.
  • Protocol and deep packet analysis.
  • Application filter design, implementation and unit testing.
  • Preparation of application filters artifacts.

Environment: Windows, TCP/IP Protocol Stacks, Perl Scripting, Wireshark.

Confidential

Tech Lead

Responsibilities:

  • Requirements analysis and design for new RFEs.
  • Coding, critical bug fixing and resolving customer issues.
  • Design and code inspections meeting with client.
  • Customer communication for various issues and queries.
  • Assignment of tasks among the team members.
  • Drafting initial charter, project plan, coordinating efforts for completing activities in plan.
  • Providing regular status reports for all activities related to the project.
  • Identifying and resolving issues & mitigating risks.
  • Working with Project Manager to ensure resource workload is balanced across project.

Environment: NSK, C, C++, eGarth/eInspect, GenWeb, Source Navigator, StarTeam.

Confidential

Senior Developer

Responsibilities:

  • Requirements analysis and design for new RFEs.
  • Coding, critical bug fixing and resolving customer issues.
  • Design and code inspections meeting with client.
  • Customer communication for various issues.
  • Assignment of tasks among the team members.
  • Drafting initial charter, project plan, coordinating efforts for completing activities in plan.
  • Providing regular status reports for all activities related to the project.
  • Identifying and resolving issues & mitigating risks.

Environment: NSK, C, C++, TAL, PuTTY, eGarth/eInspect, GenWeb, Source Navigator, StarTeam, Multi-Threading, Sockets.

Confidential

Senior Developer

Responsibilities:

  • Requirements analysis, planning and estimation for enhanced features.
  • Design, coding and unit testing.
  • Internal and external code review with client.
  • Preparing the unit and assembly test design and test report.
  • Execution of assembly testing.
  • Bug fixing and solving production issues.
  • Leading the Rating and Admin modules of the application end to end.
  • Identifying and fixing performance-related issues.
  • Preparing the functional flow project documents using UML diagrams.
  • Team support activities on various production issues.

Environment: Windows, C++/VC++, Sybase 9.0/10.0/11.0, Microsoft Visual Studio 6.0/8.0, Microsoft Visio, Multi-Threading, RPC, Web Service Client, Crystal Report, Microsoft VSS, StarTeam, PVCS.

Confidential

Developer

Responsibilities:

  • Requirements analysis for missing test drivers.
  • Developing test drivers for the missing components.
  • Porting of components as well as test drivers from AIX to Linux.
  • Migrating from the Oracle 9.2.0 client to the Oracle 10.2.0 client.
  • Fixing memory leaks using Rational Purify.
  • Preparing the unit and assembly test design and test report.
  • Unit, assembly, system/integration and performance testing.
  • Bug fixing and solving production issues.

Environment: Red Hat Linux 8.0, C++, Oracle 9i, GDB, Microsoft VSS, ClearCase, PuTTY, Rational Purify.

Confidential

Developer

Responsibilities:

  • Requirements analysis, design, coding and unit testing.
  • Internal and external code review with client.
  • Bug fixing.

Environment: Unix, C++, Oracle 8i, GDB, Microsoft VSS, ClearCase.
