Big Data Developer - Technical Analyst Resume
Bentonville, AR
SUMMARY:
- 10+ years of software development experience spanning requirement analysis, design, coding, testing, deployment, and production support of large-scale Java/J2EE and Big Data applications.
- Excellent skills in analyzing, designing, planning, implementing, unit testing, and deploying large-scale Java/J2EE and Big Data applications.
- Three years of hands-on experience with the Big Data technology stack, including Apache Kafka, Spark, Hadoop, Hive, Cassandra, MapReduce, Oozie, and Akka.
- Trained in AWS (EMR, DynamoDB, S3), with one year of experience using these components to build POCs and work on external projects.
- Expertise in building infrastructure with J2EE and Big Data ingestion and analytics frameworks.
- Hands-on experience with DSE Cassandra; completed POCs with MongoDB and earned a course completion certificate from MongoDB University.
- Good knowledge of MVC frameworks such as Struts, Spring, and Spring Boot, as well as microservices.
- Familiar with various IDEs, including Eclipse, IBM RAD, and IntelliJ.
- Good working knowledge of web services (REST/SOAP), the Apache Camel enterprise integration framework, and Apache Activiti with BPMN 2.0.
- Familiar with ORM tools such as Hibernate and build tools such as Apache Ant, Maven, and SBT.
- Good working knowledge of version control tools such as SVN and Git.
- Quick learner with the flexibility to pick up new technologies.
- Good grasp of object-oriented programming (OOP) and functional programming concepts.
- Strong customer interaction and team management skills.
TECHNICAL SKILLS:
Programming Languages: Java/J2EE, Scala
J2EE Frameworks: Spring, Spring Boot, Struts, Apache Camel, Apache Activiti
Java APIs: Google Guice, JPA, EhCache, JAXB, Hibernate (ORM), JAX-RS, JAX-WS, JDBC, Lift-JSON, etc.
IDEs: Eclipse, RAD, IntelliJ
Web Servers: Apache Tomcat, JBoss, Jetty, TomEE
Build Tools: Apache Ant, Maven, SBT
Scripting Languages: Perl, Shell
Big Data Frameworks: Apache Spark, Hortonworks HDP 2.4, Oozie, Hive, Apache Ignite, Apache Kafka, MapReduce, Akka, Elasticsearch
Other Tools: SonarQube, Jenkins, HP ALM, SQL Workbench
Operating Systems: Windows 98/NT/XP/2000/Vista/7, Linux/UNIX
Databases/NoSQL: DataStax Cassandra 4.8.7, DynamoDB, MS SQL Server, PostgreSQL, MySQL, Informix, MongoDB
PROFESSIONAL EXPERIENCE:
Confidential, AR
Big Data Developer - Technical Analyst
Responsibilities:
- Gathered requirements from the program team; designed and developed the application.
- Analyzed, designed, and implemented data pipelines to ingest data into Kafka and other persistence layers.
- Analyzed provider data from flat files using Spark, transformed it as needed, and published it to Kafka.
- Streamed data from Kafka topics with Spark, applied the required transformations, and saved the results to Cassandra and Elasticsearch.
- Performed Cassandra data modeling per requirements.
- Scheduled the Spark jobs.
- Created CI/CD pipelines for build and deployment.
Environment: Scala 2.11, Spark 1.6.1/2.2, Cassandra, Kafka
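As an illustration of the flat-file ingestion step described in this role, here is a minimal sketch of the record-parsing logic in plain Java; the three fields and the pipe delimiter are illustrative assumptions, not the actual project schema.

```java
import java.util.Optional;

// Sketch of parsing one delimited provider record before it is published
// to Kafka. Field names and the '|' delimiter are hypothetical.
public class ProviderParser {

    public static class ProviderRecord {
        public final String id;
        public final String name;
        public final String state;

        public ProviderRecord(String id, String name, String state) {
            this.id = id;
            this.name = name;
            this.state = state;
        }
    }

    // Parses one flat-file line; returns Optional.empty() for malformed input.
    public static Optional<ProviderRecord> parse(String line) {
        String[] parts = line.split("\\|");
        if (parts.length != 3) {
            return Optional.empty();
        }
        return Optional.of(new ProviderRecord(parts[0].trim(), parts[1].trim(), parts[2].trim()));
    }
}
```

In a Spark job, a parser like this would typically be applied per line (e.g. in a flatMap) before each record is serialized and sent to a Kafka topic.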
Confidential, AR
Big Data Developer - Technical Analyst
Responsibilities:
- Built Spark Streaming jobs to decrypt, decode, and parse streaming data from Kafka topics into Hive tables.
- Implemented functional audits to verify messages across different layers and ensure data quality.
- Designed and implemented Spark jobs for internal audits, validating data in the inventory data stores (Hive and Cassandra).
- Pushed data to Elasticsearch for visualization and analysis.
- Created a MapReduce job (legacy design) to archive data to Hadoop.
- Created Oozie workflows for the Spark jobs.
- Created CI/CD pipelines for build and deployment.
- Built a Cassandra Spark job to aggregate real-time data, save the results to other Cassandra tables, and expose them via RESTful web services.
- Gathered requirements from the program team; designed and developed the application.
Environment: Java 1.8, Java 1.7, Scala 2.10.6, Spark, Cassandra, Kafka, Storm, REST web services, Oozie
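The aggregation step in the Cassandra Spark job above can be sketched with plain Java collections (event fields and key names are hypothetical); in Spark the same shape would be a groupBy/reduceByKey over the streamed records before the rollup is written to a separate Cassandra table.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class InventoryAggregator {
    // One real-time inventory event; field names are illustrative only.
    public static class Event {
        public final String itemId;
        public final long quantity;

        public Event(String itemId, long quantity) {
            this.itemId = itemId;
            this.quantity = quantity;
        }
    }

    // Aggregates quantities per item, mirroring the rollup a Spark job
    // would compute before saving it to another Cassandra table.
    public static Map<String, Long> totalsByItem(List<Event> events) {
        return events.stream()
                .collect(Collectors.groupingBy(e -> e.itemId,
                        Collectors.summingLong(e -> e.quantity)));
    }
}
```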
Confidential, AR
BigData and Java-J2EE Developer
Responsibilities:
- Worked with the business to identify Big Data use cases.
- Designed, developed, tested, and deployed RESTful applications.
- Designed and implemented data pipelines and components.
- Performance-tuned data transformation and integration components to reduce the execution window.
- Created CI/CD pipelines for building and deploying applications.
- Created schedulers for the Spark jobs.
Confidential
Senior Application Developer
Responsibilities:
- Followed Agile and Scrumban practices for the development process.
- Analyzed and developed customer requirements for both the feeders and the dispatcher.
- Designed a new architecture for the feeder restructure; created and submitted the use case and prepared technical design documents for approval by the design review council.
- Designed and developed the restructuring and conversion of feeders to a new pub-sub architecture using Camel components and Spring.
- Designed and implemented dispatcher changes to consume a new REST web service for the new workload service.
- Wrote extensive core Java code in the dispatcher and feeders, consuming REST web services and IBM MQ.
- Implemented Camel routes for the feeders and multithreading with Camel.
- Wrote JUnit test cases using JMockit and JUnit.
- Developed efficient SQL queries against the transportation DB.
- Wrote Shell/Perl scripts to automate startup, performance, and integration tests.
- Analyzed complex relationships between schema tables using ER diagrams and designed model and business classes.
Environment: JDK 1.6, Apache Camel, REST web services, Spring, JMockit, Informix, JMS, IBM MQ, XML, XSD, Maven, JAXB, JDBC, Eclipse 4.x, Perl, Shell, Apache Kafka
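The feeder/dispatcher pub-sub pattern described in this role (implemented in the project with Camel routes over IBM MQ) can be sketched with a stdlib blocking queue; the class and message names here are illustrative stand-ins, not project code.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class FeederDispatcherSketch {
    // In the real system the channel is an IBM MQ destination consumed
    // via Camel; an in-memory BlockingQueue stands in for it here.
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();

    // Feeder side: publish a message to the channel.
    public void feed(String message) {
        queue.offer(message);
    }

    // Dispatcher side: drain and return all currently queued messages
    // in FIFO order for downstream processing.
    public List<String> dispatch() {
        List<String> drained = new ArrayList<>();
        queue.drainTo(drained);
        return drained;
    }
}
```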
Confidential, Bentonville, AR
Senior Application Developer, Technical Analyst
Responsibilities:
- Followed an Agile, test-driven development (TDD) process.
- Authored technical design documents for the batch process and the REST services for Driver Calendar and Office Calendar; built POCs with Apache Camel and Spring.
- Took complete ownership of artifacts, including load test report analysis, release/delivery scoping, defect management, and unit testing plans.
- Followed up closely with project members on daily activities.
- Led development and validation activities for the assigned sub-modules.
- Developed batch jobs.
- Created implementation plans and raised change controls for installing the batch jobs in Linux and UNIX environments.
- Ensured that all testing (unit, system, and regression) was completed before each release.
- Handled defect management and production support.
- Reviewed fixes and performed code quality reviews.
Environment: Java 1.6, Hibernate 3, WebSphere, Jetty, Informix, Guice, RESTful web services, Apache Camel, EasyMock, JMockit
