
Senior Hadoop Developer/Big Data Resume


Dallas, TX

PROFESSIONAL SUMMARY:

  • 12+ years of experience in Big Data/Hadoop, J2EE/Java development, verification, validation, and other aspects of the software development life cycle. Multi-faceted experience applying information and networking technology, with emphasis on deployment and support across complex hybrid enterprise-level ecosystems.
  • Extensive work on analysis, development, integration, and maintenance of web-based and client/server applications using Java and Big Data technologies (HDFS, Spark, MapReduce, YARN, Pig, Hive, HBase, Sqoop, Flume, Oozie).
  • Implemented Big Data migrations from existing RDBMS (SQL) systems as part of an IaaS offering, using data ingestion tools such as Sqoop on Hortonworks HDP/Cloudera CDH distributions, along with Amazon Simple Storage Service (Amazon S3), Amazon EC2, AWS EMR, GCP, and Splunk.
  • Experience with automation tools: Puppet, Ansible, LoadRunner, Shenick diversifEye, EXFO, Spirent, QTP, JUnit, TestNG, Selenium, SoapUI, ETL, Talend DI.
  • Good understanding of CI/CD processes and DevOps tools such as Jenkins, Bamboo, Git, and Bitbucket, and of open-source frameworks such as Spring MVC, Hibernate, and SOA web services.
  • Knowledge of various Network Element/North-Bound NMS operations systems, with onsite experience at customer labs in service fulfillment, OAM, and lifecycle change management.
  • Enabled dashboards and KPI reports to collect, monitor, analyze, and troubleshoot software/configuration issues, ensuring their swift resolution.
  • Technical lead with excellent analytical and communication skills.

PROFESSIONAL EXPERIENCE:

Confidential, Dallas TX

Senior Hadoop Developer/Big Data

Responsibilities:

  • Implement a Hadoop/big data platform for processing large volumes of structured and unstructured data. Architect new data process flows and ingestion pipelines for the data lake, data reconciliation, and visualization.
  • Architecture design, data modeling, and implementation of a Big Data platform and analytic applications using HDFS, Spark, Pig, Hive, MapReduce, Kafka, Storm, Java, Python, and Scala.
  • Create Hive tables, load them with data, and write Hive queries on Cloudera EDH-based platforms. Design, model, and create Hive external tables using a shared metastore with partitioning and bucketing.
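For illustration, a partitioned, bucketed external Hive table of the kind described above might be declared as follows (the table name, columns, and location are hypothetical, not from any specific project):

```sql
-- Hypothetical external table; schema and path are illustrative only.
CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
  user_id   BIGINT,
  url       STRING,
  duration  INT
)
PARTITIONED BY (event_date STRING)          -- prune scans by date
CLUSTERED BY (user_id) INTO 32 BUCKETS      -- bucket for join/sampling efficiency
STORED AS ORC
LOCATION '/data/warehouse/web_logs';
```

Because the table is external, dropping it removes only the metastore entry, not the underlying HDFS data.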
  • Work on custom UDFs in Hive and Pig using Java and Python. Use Spark to perform analytics on data in Hive.
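A minimal sketch of a Python "UDF" of the streaming kind Hive supports via TRANSFORM (all names here are hypothetical): Hive pipes tab-separated rows to the script's stdin and reads tab-separated rows back from stdout.

```python
#!/usr/bin/env python3
# Sketch of a Hive streaming "UDF" used via TRANSFORM (names are hypothetical).
# Hive sends tab-separated input rows on stdin and expects rows on stdout.
import sys

def normalize_url(url):
    """Lower-case a URL and drop a single trailing slash."""
    url = url.strip().lower()
    return url[:-1] if url.endswith("/") else url

def transform(lines):
    """Yield one output row per input row: user_id unchanged, url normalized."""
    for line in lines:
        user_id, url = line.rstrip("\n").split("\t")
        yield "{}\t{}".format(user_id, normalize_url(url))

if __name__ == "__main__":
    sys.stdout.write("\n".join(transform(sys.stdin)))
```

In HiveQL this would be wired in with `ADD FILE normalize.py;` followed by something like `SELECT TRANSFORM(user_id, url) USING 'python normalize.py' AS (user_id, url) FROM web_logs;`.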
  • Create Oozie jobs as part of a DevOps role; responsibilities include monitoring, testing, and troubleshooting failed jobs.
  • Implement High Availability (HA) for Resource Manager, Oozie, Hue, NoSQL stores (HBase, Cassandra), HBase Thrift server, Hive metastore, Kafka, and cron jobs.

Confidential, Plano, TX

Senior Hadoop Developer/Big Data Ericsson OSS Analytics

Responsibilities:

  • Processed large volumes of network element data using Expert Analytics to calculate key performance indicators and use cases across core customer network data (performance, fault management, configuration, events, etc.), customer- and Ericsson-generated data (spreadsheets, reports, etc.), and events for monitoring and troubleshooting the network.
  • Implemented Hadoop/big data platform best practices for processing large volumes of structured and unstructured data using MapReduce and Hive.
  • Worked on exporting and importing data using Sqoop between HDFS/MapR-FS and MySQL databases.
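A typical Sqoop import/export pair of the sort used here looks like the following; the connection string, credentials, table names, and HDFS paths are placeholders, not actual project values:

```shell
# Hypothetical connection details; --num-mappers controls parallel import tasks.
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --username etl_user --password-file /user/etl/.db_pass \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4

# Export aggregated results back to MySQL.
sqoop export \
  --connect jdbc:mysql://db.example.com/sales \
  --username etl_user --password-file /user/etl/.db_pass \
  --table order_summary \
  --export-dir /data/out/order_summary
```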
  • Designed and implemented a continuous build, integration, and deployment process using Jenkins. Developed MapReduce jobs in Java for data preprocessing.
  • Configured and upgraded Hadoop ecosystems. Tuned Spark applications using PySpark for better performance, reducing processing time.
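Spark tuning of this kind is usually expressed through spark-submit settings; a hedged sketch follows, where the specific values and the job name are illustrative and depend entirely on cluster size and data volume:

```shell
# Illustrative tuning knobs; job.py and all values are hypothetical.
spark-submit \
  --master yarn \
  --num-executors 20 \
  --executor-cores 4 \
  --executor-memory 8g \
  --conf spark.sql.shuffle.partitions=200 \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  job.py
```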
  • Developed, tested, and implemented proof-of-concept solutions using various big data tools/technologies as needed. Agile tools used: Puppet, TestNG.
  • Implemented High Availability (HA) for Resource Manager, Oozie, Hue, NoSQL (Cassandra), HBase Thrift server, HiveServer2, Hive metastore, Kafka, ETL (Talend DI), and cron jobs.
  • Shared knowledge with team members whenever new functionality or changes were added or planned.
  • Managed the full life cycle of Hadoop/big data solutions, including requirements analysis, platform selection, and technical design.

Confidential

Senior Hadoop Developer/Big Data

Responsibilities:

  • New Product Introduction (NPI) for OSS technology using Adtran AOE/TA500x products to migrate to a Big Data Hadoop ecosystem.
  • Developed test plans and MOPs to integrate ECIL (Ericsson Common Integration Layer) and Adtran AOE. Set up a virtualized environment with the VMware ESXi hypervisor, RHEL (Red Hat Enterprise Linux) guest OS, and the JBoss application server.
  • Multitier enterprise architecture, installation, configuration, and system verification testing, centered on Northbound Integration (NBI) with OSS applications.
  • Configured High Availability (five 9's), set up database and file replication between master and standby sites with switchover/failover, ingested data from sources into HDFS using Sqoop, and implemented data warehouse, data mart, ODS, and data lake solutions.
  • Imported data from various data sources, performed transformations using Hive and Pig, and loaded data into HDFS for aggregation.
  • Developed test cases and test plans for system verification and created CI/CD pipelines in Jenkins for test automation and release. Tested and troubleshot ECIL Order Care & Catalog JAX-WS web services using SoapUI, with backend integration against RDBMS databases via SQL.
  • Integrated and tested Ericsson ECIL NMS and Adtran AOE EMS with northbound B/OSS systems for provisioning and performance management.
  • Cut installation time by 50% through automation and scripting, and configuration time by 30% using provisioning flows, improving project cost.

Confidential

Services Integrator

Responsibilities:

  • IP network transformation for Sprint NV project to deploy and launch 4G LTE sites using Ericsson NGN IP products and Network management systems.
  • Service delivery of IP-based backhaul using the Ericsson MINI-LINK portfolio; extensive work integrating SP210, SP310, and TN network elements on the service provider's MPLS backbone.
  • FCAPS management of SNMP-based MINI-LINK nodes using ServiceOn EM in a Linux environment. Monitored alarms, performance management, NE software upgrades, and backups on a 24x7 rotating schedule for managed services.
  • Handover the completed sites to Service Assurance team.

Confidential

Software Solutions Integrator

Responsibilities:

  • Customer integration/support for the Ericsson EDA 1500/1200 GPON/IP DSLAM solution as part of AT&T's Project Lightspeed to deploy triple-play services: voice/data/IPTV.
  • NPI (New Product Introduction) at multiple AT&T labs and NOCs to introduce the EDA 1500 GPON product line and its Element Management System (EMS). Integrated and supported EMS TL1 northbound integration with the customer's legacy and new IT back-end systems.
  • Provided input to hardware and software configuration gold standards and test specifications; performed testing and produced test result documents. Handled ticketing, escalation, and resolution of issues from an end-to-end perspective to deliver data/IPTV subscription services.
  • Worked on next-generation carrier-class products, wireline broadband access networks, and various FTTx last-mile technologies.
  • Application-layer protocol knowledge of IP networks, multicast features such as IGMP and PIM-SM, access control using 802.1X authentication, AAA, and IP network design.
  • Experience deploying and maintaining network (LAN/WAN) KPIs to support IP-based converged voice and video applications.
  • Created value by proactively analyzing, monitoring, and tracking issues to minimize project risks. Provided governance reporting to executive management.

Confidential

System Integrator

Responsibilities:

  • Customer system integrator for the Ericsson SMS/MMS solution and content delivery systems for various Tier-2/3 operators in a hosted environment.
  • Integrated Ericsson's solution for MMS, MMS alerts, and content delivery systems, including 3PP components, in a multi-tenant hosted environment.
  • Performed large-scale database migration for the MIEP WAP gateway and verification of multiple tenants in a shared environment.

Confidential

Systems Engineer

Responsibilities:

  • HP-ODC consultant for LaserJet printer firmware development and verification.
  • Developed and maintained LaserJet printer firmware components for direct PDF printing; protocol-level knowledge of PostScript and PCL.
  • Ported bug fixes in C/C++ across different product lines, performed verification and testing, and produced test reports.
  • Provided build support and version control (SCM, Software Configuration Management) using Rational ClearCase.

Confidential

Test Engineer

Responsibilities:

  • Automated the test environment for a switching solution for broadband internet/intranet service providers.
  • Developed, implemented, and verified test cases and test scripts for inclusion in automated test environments.
  • Analyzed regression test results and provided input to the development team.
  • Maintained intranet web pages for the IOS group to display product-related information.
