Big Data Consultant Resume
PROFESSIONAL SUMMARY:
- 13 years of software development experience with a strong emphasis on the design, development, implementation, deployment, and support of software applications.
- 3 years of extensive experience with Big Data and Big Data analysis tools.
- In-depth understanding of Hadoop architecture and its components: HDFS, MapReduce, and YARN.
- Hands-on experience with Big Data analysis tools such as Hadoop, HDFS, MapReduce, Hive, Kafka, HBase, Elasticsearch, Spark Core, Spark SQL, and Spark Streaming.
- Expertise in building Spark batch and streaming jobs in Scala for faster analysis and processing of data.
- Expertise in writing Hadoop jobs for analyzing data using HiveQL and Pig scripts.
- Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
- Worked with NoSQL databases like HBase and Cassandra.
- Experience working with object-oriented implementation technologies such as Java/J2EE and C++.
- Experience in the development and integration of web applications using Servlets, JSP, the Spring Framework, Web Services, XML, HTML, CSS, and JavaScript.
- Implemented applications using the Spring Framework and Spring Cloud Data Flow on Pivotal Cloud Foundry.
- Experience working with the Elasticsearch search engine.
- Experience working with relational databases such as Oracle, MySQL, and DB2.
- Expertise in UNIX technologies: shell scripting, regular expressions, sed, awk, etc.
- Experience working with the C, Perl, and EPM languages.
- Strong experience with and understanding of software development methodologies such as the Agile Methodology and the Waterfall Model.
- Expertise in SingleView Convergent Billing and Amdocs Charging System products in Telecom Billing Solutions.
- Handled several techno-functional responsibilities, including estimation, identification of functional and technical gaps, requirements gathering, solution design, development, performance improvement, and production support.
- Experience in handling performance issues of production systems.
TECHNICAL SKILLS:
Hadoop Ecosystem & Tools: Hadoop, HDFS, MapReduce, Spark, Hive, Redis, HBase, Cassandra, Apache Phoenix, Elasticsearch, Pig, Oozie, Sqoop, ZooKeeper
Hadoop Distributions: Hortonworks, Cloudera
Languages: Scala, Java SE, J2EE, C++, C, Perl, SQL/PLSQL
Web Technologies: JSP, Servlets, JDBC, MVC, Spring, XML, JavaScript, HTML, CSS
Messaging: JMS, Apache Kafka, RabbitMQ
Frameworks: Spring, SCDF
Cloud Platforms: Pivotal Cloud Foundry
Scripting: Unix Shell Scripting, Perl
RDBMS: Oracle, MySQL, Teradata
Operating Systems: Unix, Linux, Windows
Web/Application Servers: Apache Tomcat, WebSphere Application Server
IDE / Tools: Eclipse, IntelliJ IDEA, Maven, SBT, SVN, GitHub, Jenkins, Toad, SQL Developer, Remedy, JIRA, SQL Workbench, CVS, SVeet Cucumber
Methodologies: Agile Methodology (Scrum), Waterfall Model
PROFESSIONAL EXPERIENCE:
Confidential
Big Data Consultant
Responsibilities:
- Designed and implemented customer requirements for the Usage-Based Insurance program module.
- Interacted with the client and other vendors to integrate the solution with other interfaces.
- Performed data integration, transformation, and reduction by developing Spark jobs in Scala.
- Implemented Spark Streaming jobs to process PAS events in real time and update them in HBase and Elasticsearch (a minimal sketch follows this list).
- Implemented Spark batch jobs to process events based on business rules and information from other vendors.
- Used Scala to develop both Spark Streaming and batch jobs.
- Worked on Kafka distributed messaging system for streaming data processing.
- Created Elasticsearch indexes, fetched data from them, and mapped the results to the customer service portal.
- Processed data from HBase tables using Apache Phoenix and the Java API.
- Implemented Spring Boot applications and SCDF apps for the Confidential mobile project.
- Created event-processing streams with Spring Cloud Data Flow (SCDF) and deployed them on Pivotal Cloud Foundry.
- Involved in the implementation of the Customer Service Portal using Java.
- Reviewed deliverables for technical completeness and to ensure the performance, operability, maintainability, and scalability of the proposed technical solution.
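A minimal, illustrative Scala sketch of the streaming work described above: a Spark Streaming job that consumes PAS events from Kafka and upserts them into HBase. The topic, table, and column-family names are assumptions, not details from the project.

import org.apache.hadoop.hbase.TableName
import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
import org.apache.hadoop.hbase.util.Bytes
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

object PasEventStream {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("pas-event-stream")
    val ssc  = new StreamingContext(conf, Seconds(10))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "broker1:9092",          // assumed broker address
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "pas-consumers",
      "auto.offset.reset"  -> "latest"
    )

    // Direct stream over the assumed "pas-events" topic.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("pas-events"), kafkaParams))

    stream.foreachRDD { rdd =>
      rdd.foreachPartition { records =>
        // One HBase connection per partition keeps connection churn low.
        val hbase = ConnectionFactory.createConnection()
        val table = hbase.getTable(TableName.valueOf("pas_events"))
        records.foreach { rec =>
          // Assumes events are keyed by an event id; the payload is stored as-is.
          val put = new Put(Bytes.toBytes(rec.key))
          put.addColumn(Bytes.toBytes("e"), Bytes.toBytes("payload"), Bytes.toBytes(rec.value))
          table.put(put)
        }
        table.close()
        hbase.close()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}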
Environment: Spark, Scala, Java, J2EE, Kafka, Apache HBase, Apache Phoenix, Hadoop, Hive, Oozie, ZooKeeper, Elasticsearch, Web Services, Spring Framework, SCDF, PCF, Tomcat, SOA, SQL, Shell Scripting, SQL Workbench, Kibana, Splunk, Maven, SBT, SVN, GitHub, IntelliJ IDEA, Eclipse, Jenkins, UNIX, Windows
Confidential
Big Data Developer
Responsibilities:
- Performed data integration, transformation, and reduction by developing Spark jobs in Scala.
- Used Spark Streaming jobs to consume and load data into Cassandra database tables.
- Wrote Spark batch jobs to load data from a SQL database into Cassandra (a minimal sketch follows this list).
- Wrote Spark SQL scripts to process structured data using SchemaRDDs, DataFrames, transformations, actions, and joins.
- Used the Spark CassandraSQLContext to perform joins across Cassandra tables.
- Used Scala to develop Spark Core, Spark Streaming, and Spark SQL jobs.
- Worked with both the Spark SQLContext and the HiveContext.
- Worked on Kafka distributed messaging system for streaming data processing.
- Created and altered Cassandra database components such as keyspaces, column families, and indexes.
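A minimal, illustrative Scala sketch of the batch loads described above: a Spark job that reads rows over JDBC and appends them to a Cassandra table through the DataStax spark-cassandra-connector. The connection details, keyspace, and table names are assumptions.

import org.apache.spark.sql.SparkSession

object SqlToCassandraBatch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sql-to-cassandra")
      .config("spark.cassandra.connection.host", "cassandra-host") // assumed host
      .getOrCreate()

    // Read the source rows from the relational database over JDBC.
    val orders = spark.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@db-host:1521/ORCL") // assumed URL
      .option("dbtable", "ORDERS")
      .option("user", "app_user")
      .option("password", sys.env("DB_PASSWORD")) // fails fast if unset
      .load()

    // A simple transformation before the load: keep completed orders only.
    val completed = orders.filter(orders("STATUS") === "COMPLETED")

    // Append into Cassandra; the keyspace and table must already exist.
    completed.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "sales", "table" -> "orders"))
      .mode("append")
      .save()

    spark.stop()
  }
}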
Environment: Spark, Scala, Java, J2EE, Kafka, Apache Cassandra, Hadoop & YARN, RDBMS, SQL, PL/SQL, C#, Web Services, Tomcat, JDBC, Oozie, Shell Scripting, SVN, Maven, IntelliJ IDEA, UNIX, Windows, Toad, Eclipse
Confidential
Development Consultant Specialist
Responsibilities:
- Designed customer requirements for end-to-end modules of the SingleView product.
- Interacted with the client and other vendors for requirements gathering.
- Involved in scoping new functionality.
- Developed customer requirements (CRs).
- Led the development team and was responsible for CR delivery.
- Played the Scrum Master role as part of the Agile methodology.
- Lead developer for upgrading the MTNSA solution to the SV9 core version.
- Provided support Confidential onsite for major release rollouts.
- Estimated development efforts.
- Mentored and guided other team members.
- Involved in the implementation of Product Catalogue, Normalisation, Rating, Discount, Billing, Invoicing, Migrations, APIs, and Reports.
- Resolved various performance issues in the production system.
- Migrated customers from other systems to SingleView.
- Used Sqoop to import data from RDBMS into Hadoop and to export data from Hadoop back to RDBMS.
- Wrote Hive jobs to process data per business requirements for further analysis by the data analytics team (a minimal sketch follows this list).
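A minimal, illustrative Scala sketch of the Hive jobs described above: a HiveQL aggregation run from Spark with Hive support enabled. The database, table, and column names are assumptions.

import org.apache.spark.sql.SparkSession

object UsageSummaryJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("usage-summary")
      .enableHiveSupport() // reads table metadata from the Hive metastore
      .getOrCreate()

    // Aggregate Sqoop-imported usage records by account and day.
    spark.sql(
      """
        |INSERT OVERWRITE TABLE analytics.daily_usage
        |SELECT account_id, usage_date, SUM(units) AS total_units
        |FROM staging.usage_records
        |GROUP BY account_id, usage_date
      """.stripMargin)

    spark.stop()
  }
}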
Environment: SingleView V9.0, UNIX, Windows, Java, J2EE, Perl, C++, C, EPM, Web Services, Tuxedo, Oracle, Shell scripting, Hadoop, Hive, Spark, Scala, Toad, Eclipse, SOAP, SharePoint, Quality Center, Remedy, JIRA, Relgen, CVS, ECCS, SVeet Cucumber, Confluence
Confidential
Senior Development Consultant
Responsibilities:
- Developed customer requirements.
- Interacted with the client for requirements gathering.
- Involved in the design, coding, unit testing, and support phases.
- Estimated development efforts.
- Mentored and guided other team members.
- Provided support Confidential onsite for major release rollouts.
Environment: SingleView V6.0, UNIX, Windows, Java, Perl, C++, C, EPM, Tuxedo, Oracle, Shell scripting, Toad, Eclipse, SOAP, SharePoint, Quality Center, Relgen, CVS, ECCS
Confidential
Senior Subject Matter Expert
Responsibilities:
- Lead developer responsible for the Amdocs Rater module.
- Interacted with the client for requirements gathering.
- Interacted with other module leads (PC, A&F, Invoicing, A&R) to understand the end-to-end flow.
- Involved in the design, coding, unit testing, and support phases.
- Estimated development efforts.
- Mentored and guided other team members.
- Provided support Confidential onsite for the initial phase of every major release rollout.
Environment: Amdocs Charging Product (Enabler), UNIX, Windows, Java, C++, C, Oracle, Shell scripting, Toad, Eclipse, Quality Center, CVS
Confidential
Technical Associate (Developer)
Responsibilities:
- Developed new classes in response to changing requirements.
- Involved in coding, test case preparation, and unit testing.
- Estimated development efforts.
- Fixed defects during the support phases.
Environment: C++, MFC, Oracle, Windows NT/XP, CSS Extra Client (Mainframe), Toad, Quality Center