
Sr. Big Data Developer Resume

Dallas, TX

SUMMARY:

  • 11.6+ years of experience in software development, analysis, implementation, architecture, design, and application support.
  • Proficient in developing Big Data applications using the Hadoop ecosystem (HDFS, MapReduce, Pig, Hive, Impala, Sqoop, HBase, Flume, Oozie, Elasticsearch, Cassandra, Spark, Kafka, Kibana).
  • Extensively involved in handling real-time data ingestion using Spark Structured Streaming and Kafka, with application programming in Scala.
  • Proficient experience in designing applications using Java, J2EE, Servlets, JSP, JDBC, EJB, XML, JMS, SQL, PL/SQL, Struts 1.2, Hibernate, Web Services, Spring Boot, plug-in development, etc.
  • Involved in large-scale big data cluster-to-cluster migrations and AWS cloud platform activities, developing executable solutions that exactly replicate the source cluster in terms of data and application behavior.
  • Strong experience in data ingestion, data analysis, data modeling, data governance, metadata management, and data quality management using Scala, Java, and query languages with Spark and Hadoop stack APIs.
  • Proficient implementation exposure across all phases of the Software Development Life Cycle, Agile methodologies, and practices.
  • Architect big data applications with attention to available YARN resources, cluster utilization, code reusability, performance, SLAs, data modernization, and security; constantly look for cost-saving opportunities by fine-tuning or moving to easily configurable open-source systems. End-to-end delivery ownership (development to production).
  • Functional experience includes PLM (Product Lifecycle Management), Telecom, Healthcare, Manufacturing, and Travel & Tourism domains.
  • Experienced in developing, building, and deploying feeds, reports, monitoring, and alert notifications to track applications and workflows using Oozie, Unix shell scripting, and Splunk monitoring tools.
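A minimal sketch of the real-time ingestion pattern referenced above (Spark Structured Streaming consuming from Kafka, in Scala). Broker, topic, and path names below are illustrative placeholders, not taken from any project on this resume:

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: assumes a running Spark cluster and Kafka broker.
object StreamingIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("realtime-ingest").getOrCreate()

    // Read a Kafka topic as an unbounded streaming DataFrame.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092") // hypothetical broker
      .option("subscribe", "events")                     // hypothetical topic
      .load()
      .selectExpr("CAST(value AS STRING) AS payload")

    // Land micro-batches on HDFS; the checkpoint location gives
    // fault-tolerant, exactly-once file-sink semantics.
    events.writeStream
      .format("parquet")
      .option("path", "/data/raw/events")          // hypothetical output path
      .option("checkpointLocation", "/chk/events") // hypothetical checkpoint
      .start()
      .awaitTermination()
  }
}
```

This mirrors the lambda-architecture speed layer: the raw landing zone written here can then be served to batch jobs downstream.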

SKILLS:

Big Data Ecosystems: Hadoop, HDFS, HBase, MapReduce, Hive, Pig, Oozie, Sqoop, Flume, Spark/Scala, Kafka, MapR-FS, MapR-DB, MapR Streams, Elasticsearch, Cassandra, Kibana, AWS (EC2, EMR, S3)

Java/J2EE Technologies: Java, SWT, J2EE, Servlets, JSP, Struts, XML, JSON, Hibernate, JMS, EJB, Web Services, Spring Boot

Databases and Tools: Oracle 10g, SQL Developer, TOAD, MySQL, MS Access, etc.

Servers: WebLogic Server, Apache Tomcat, JBoss, Linux

Tools: IntelliJ IDEA, Eclipse, MyEclipse, NetBeans, WebLogic Workshop, Microsoft Visual Studio, Stylus Studio, Git, Jenkins, MapR, Hortonworks, Cloudera, iReport 3.1, BIRT Reporting, Teamcenter 8.x/9.x

PROFESSIONAL EXPERIENCE:

Confidential, Dallas, TX

Sr. Big Data Developer

Responsibilities:

  • As part of the atom-dev team, following Agile methodologies, responsible for developing scalable YARN/Spark Lambda-architecture application modules (realtime, batchprocess, dbreport, genericfeed, adhocprocess, elkload) with audit and notification mechanisms.
  • Responsible for upgrading existing applications to the latest APIs during Hortonworks and Spark version upgrades and cluster patches; performed code reviews and provided feedback on technology best practices; developed reusable libraries and coordinated estimation efforts.
  • Involved in large-scale big data cluster migrations onto vbap, vcg (Yahoo cluster), and VZ AWS cloud platforms, developing executable solutions that exactly replicate the source cluster in terms of data and application behavior.
  • Developed Splunk dashboards monitoring data ingestion, fact data, reports, feeds, and ELK loads, with alert notifications to track application workflows.
  • Responsible for production support and offshore team deployment activities.
  • Developed and enhanced applications, and migrated existing manually built applications to CI/CD pipelines with DevOps automation tools: Jenkins, SonarQube, Jira, Git, Maven, and SBT.
  • Regularly monitored Splunk dashboards, email notifications, and Oozie workflow jobs; supported applications in the warranty period and provided necessary operational support.
  • Observed ingestion, batch load, report, and feed applications, regularly checked progress on the vbap Resource Manager web UI, and resolved cluster and application failures on time without exceeding SLAs.
  • Recreated production issues in the SIT environment and developed fixes.
  • Responsible for ingestion, consumption, maintenance, production bug fixes, data cleansing, and troubleshooting production job failures with workarounds or job reruns where necessary.
  • Gathered production-parallel test results and validated them by comparing against existing live systems where available, or by verifying system behavior, before going live.
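An illustrative fragment of the Oozie workflow-plus-alerting pattern described above (a Spark action with an email notification on failure). All names, classes, and addresses here are hypothetical examples, not actual project artifacts:

```xml
<workflow-app name="batch-feed" xmlns="uri:oozie:workflow:0.5">
  <start to="spark-batch"/>
  <!-- Run the batch feed as a Spark job on YARN. -->
  <action name="spark-batch">
    <spark xmlns="uri:oozie:spark-action:0.2">
      <master>yarn-cluster</master>
      <name>genericfeed</name>
      <class>com.example.GenericFeed</class> <!-- hypothetical class -->
      <jar>${nameNode}/apps/genericfeed.jar</jar>
    </spark>
    <ok to="end"/>
    <error to="notify-failure"/>
  </action>
  <!-- Alert the team before killing the workflow. -->
  <action name="notify-failure">
    <email xmlns="uri:oozie:email-action:0.2">
      <to>team@example.com</to>
      <subject>genericfeed failed</subject>
      <body>Workflow ${wf:id()} failed at node ${wf:lastErrorNode()}</body>
    </email>
    <ok to="fail"/>
    <error to="fail"/>
  </action>
  <kill name="fail"><message>Feed failed</message></kill>
  <end name="end"/>
</workflow-app>
```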

Environment: JDK 1.8.0, Hortonworks HDP 2.6.4, Spark 2.2.0, Scala 2.11, Kafka 0.10.1, HBase 1.1.2, Phoenix 4.17.0, Oozie 4.2.0, Sqoop 1.4.6, Pig 0.16, Elasticsearch 5.5.1, Kibana, Cassandra 2.0.10, GitLab, Jenkins, SonarQube 7.8, Git, Maven 1.8, SBT 1.3.2, AWS (EMR, EC2, S3), Linux

Confidential, Phoenix

Technical Lead Consultant

Responsibilities:

  • Responsible for programming the main modules (event capturing, loading, extraction) of the Raw Data Ingestion (RDI) product, including requirements gathering, architecture and design, unit testing, troubleshooting, and documentation of big-data-oriented software applications, following Agile (Scrum) methodology. End-to-end delivery ownership (development to production).
  • Developed the EDE Domain Ingester application: data domain ingestion pipelines using the DCT & DIF tools to land data onto the EDL in the raw and sanitized zones.
  • Responsible for developing the EDE Domain Processor for data transformations, derivations, and aggregations for a domain; the final processing output is placed in the conformed or curated zone of the data domain.
  • As part of a modernization program, built a data river using MapR Streams and Kafka API programs for data across many event-emitting sources.
  • Migrated existing applications from the old DSPRD and EDL1.5 clusters to EDL2.0 clusters using mirroring, DistCp, HBase, and MapR copy-table techniques.
  • Attended regular meetings with the business stakeholders' team to gather application and data requirements and validate feasibility from a technical perspective.
  • Performed design and code fixes to address defects observed in QA, SIT, and PROD environments.
  • Gathered production-parallel test results and validated them by comparing against existing live systems where available, or by verifying system behavior, before going live.
  • Responsible for ingestion, consumption, maintenance, production bug fixes, data cleansing, and troubleshooting production job failures with workarounds or job reruns where necessary.
  • Regularly monitored Splunk dashboards, email notifications, and Oozie workflow jobs; supported applications in the warranty period and provided necessary operational support.
  • Involved in EACO calls for production issue analysis and provided workarounds to mitigate SLA breaches.
  • Responsible for production support and offshore team deployment activities.
  • Developed and enhanced applications, and migrated existing manually built applications to CI/CD pipelines with DevOps automation tools: Jenkins, Jira, Git, Maven, and SBT.

Environment: Hadoop ecosystem: MapR 6.0, HDFS, YARN, MapReduce, Hive, Pig, Flume, Sqoop, Oozie, HBase, Kafka, MapR Streams, MapR-DB, Spark/Scala, Linux

Confidential

Project Lead

Responsibilities:

  • Responsible for people management, including goal setting and providing performance feedback.
  • Worked with IT Service Management and other groups to ensure that all events, incidents, and problems were resolved per the SLA.
  • Monitored running applications and provided guidance to developers on improving DB performance.
  • Assisted the development team in identifying the root cause of slow-performing jobs and queries.
  • Provided technical architecture guidance and the selection of tools and technologies.
  • Responsible for developing and generating weekly and daily feeds, comparing the results with the RDS system, and sharing Big Data system output with downstream consumers.
  • Migrated RDS DataStage jobs to Big Data using Spark/Scala; involved in developing streaming applications.
  • Migrated complex DataStage/mainframe use cases into Spark SQL and streaming applications.
  • Worked on Kafka consumers and producers.
  • Designed the data flow in Tenant and was responsible for designing the table structures for it.
  • Understood the data, customer requirements, and business use cases.
  • Regularly and directly coordinated with the offshore team and designers to develop deliverables.
  • Involved in all technical architecture calls, providing solutions to business use cases and developing methodologies.
  • Wrote ETL jobs using Spark/Scala, Pig, MapReduce, and HBase.
  • Contributed to DevOps activities.
  • Followed Agile methodologies and actively participated in Scrum meetings.
  • Actively involved in SSMO BDS activities, addressing issues that exceeded SLAs.

Environment: Hadoop ecosystem: HDFS, YARN, MapReduce, Hive, Pig, Flume, Sqoop, Oozie, HBase, Kafka, MapR, Spark/Scala, Linux

Confidential

Software Product Consultant

Responsibilities:

  • Involved in several client meetings to understand the problem statements and other requirements
  • Developed MapReduce programs using Java.
  • Experienced in loading data into Hive partitions and creating buckets in Hive.
  • Migrated data from source systems to HDFS using Sqoop.
  • Wrote MapReduce and Pig programs for ETL and developed customized UDFs in Java.
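A representative HiveQL sketch of the partitioning-and-bucketing pattern mentioned above. The table and column names are hypothetical, chosen only to illustrate the technique:

```sql
-- Partition by date (one directory per day) and bucket by customer_id
-- (fixed number of files per partition, enabling bucketed joins/sampling).
CREATE TABLE sales (
  order_id    BIGINT,
  customer_id BIGINT,
  amount      DECIMAL(10,2)
)
PARTITIONED BY (order_date STRING)
CLUSTERED BY (customer_id) INTO 32 BUCKETS
STORED AS ORC;

-- Load one day's data into its own partition from a staging table.
INSERT OVERWRITE TABLE sales PARTITION (order_date = '2020-01-01')
SELECT order_id, customer_id, amount
FROM staging_sales
WHERE order_date = '2020-01-01';
```

Partition pruning then lets queries filtered on `order_date` scan only the relevant directory instead of the whole table.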

Environment: Hadoop ecosystem: HDFS, YARN, MapReduce, Hive, Pig, Flume, Sqoop, Oozie, HBase, Cloudera, Hortonworks

Confidential

Software Product Consultant

Responsibilities:

  • Involved in RAC-side development, defect analysis, and fixing.
  • Handled Mapping Designer tasks, generated mapped XML for unit testing, and delivered accordingly.
  • Involved in deployment activities.
  • Executed test cases and coordinated the onsite and offshore teams.
  • Helped the offshore team reproduce defects and shared functional knowledge when necessary.

Environment: Java/J2EE, Eclipse 3.8, Mapping Designer, XML, Altova XMLSpy, WebLogic 10.1, Oracle 11g, Microsoft Visual Studio 9.x

Confidential

Technical Associate

Responsibilities:

  • Developed and debugged application modules for MLP, MFOUT, MRSpecs, and MDA.
  • Prepared test specifications and carried out unit and system testing of code for all the middleware application modules.
  • Owned estimation of new requirements for MLP, MFOUT, MRSpecs, and MDA.
  • Interacted with designers to understand the functional requirements and their impact on the system.
  • Responsible for all stages in the software development process and technical issues across the entire system; delivered optimized solutions.
  • Enabled dev team support for the CIT/IVVT team and pipe-cleaning activities.
  • Worked as a team member supporting in-life faults.
  • Resolved issues within the stipulated time, without breaching the SLA.
  • Worked with different teams (designers, other component teams) to find quicker, higher-quality resolutions for faults.
  • Helped with project management and maintained reports for the faults worked on.
  • Handled weekly status calls with the onsite team to review outstanding faults.
  • Calculated FPA sheets and prepared delivery notes to support release management.
  • Built a very strong customer support and communication structure.

Environment: Java, J2EE, EJB, JMS, MQ Series, Web Services, Oracle 9i, TestNG, JUnit, WebLogic 8.1

Confidential

Associate Consultant

Responsibilities:

  • Developed all types of customizations (reports, forms, windows, and callouts).
  • Developed customized windows, forms, and reports of all types; developed reports using iReport and integrated them as well.
  • Generated reports for better understanding within the organization's units, such as Product Search, Stock History, Warehouse-wise Report, and Product & Warehouse Report.
  • Generated the BOM (Bill of Materials) Shortage report covering the entire BOM structure.
  • Implemented the goods movement transfer concept for the entire BOM structure (all BOM products).
  • Involved in all phases of the project, including transforming business requirements into development.
  • Improved application performance by optimizing functionality and through technical improvements.
  • Provided necessary inputs through demos for the client to meet their functional requirements.

Environment: Java, J2EE, Struts 1.2, Spring, Hibernate, iReport 3.1, Tomcat 6.0, Oracle 10g

Confidential

Associate Consultant

Responsibilities:

  • Developed the main STB module and reports and integrated them into the application.
  • Involved in the development of integration modules such as the STB module and the payment section.
  • Prepared UFEs (User Familiarization Experiences) and provided demos to the end user, the CE, and the solution designers.

Environment: Java, J2EE, Struts 1.2, Hibernate, Oracle 9i, JBoss
