Big Data Developer Resume
TECHNICAL SKILLS
- Analytics Platforms
- Apache Hadoop (HDFS, MapReduce, Hive, Impala, Oozie, Sqoop)
- Big Data Analytics
- Data Analytics
- Data Integration
- ETL
- Kafka
- Machine Learning
- Replication
- Splunk
- Teradata
- .NET
- Application Development
- Avro
- Continuous Integration
- Git
- HTML
- CSS
- JavaScript
- JSON
- Python
- PySpark
- Scripting
- Data Analysis
- Database Systems
- JDBC
- Oracle
- SQL
- Stored Procedures
- AWS (EC2, EMR, CloudFormation)
- Eclipse
- Java
- J2EE
- Hibernate
- Spring
- jQuery
- JSP
- Servlets
- Struts
- Job Scheduling
- Scrum
- Version Control
- Data Quality
- JIRA
- JUnit
- Tableau
- MVC (Model-View-Controller)
- Rational Team Concert
- SDLC
- Intranet
- Web Services
- AngularJS
- RESTful Web Services
- AWK
- Unix/Linux
- Deployment
- Maven
- User Interface (UI)
- Random Forest
- Scala
- MATLAB
- Business Requirements
- Requirements Gathering
- Workflow
- Integration
- Architecture
- Quality Checks
- Test Scripts
- Functional Testing
- Marketing Analysis
- EAP (Enterprise Analytics Platform)
- Instrumentation
- Best Practices
- Design Review
- Documentation
- Product Documentation
- Logging
- Scheduling
PROFESSIONAL EXPERIENCE
Confidential
Big Data Developer
Responsibilities:
- Implemented workflows to parse raw data and generate refined data, leveraging Hive, Impala, Sqoop, Hue, EMR, EC2, Athena, and Attunity Replicate
- Developed scripts and batch jobs to schedule various functional modules.
- Developed a sustainable solution to process large volumes of online market tick data (current data) with Spark, Impala, Hive, Oozie, and Python
- Migrated on-premises Hadoop data and queries to AWS cloud infrastructure while maintaining security requirements
- Automated importing of data from Oracle into HDFS and S3 using Sqoop, with Oozie for workflow orchestration, and created Hive tables according to business needs.
- Modified existing infrastructure to provision AWS query clusters, bridging the gap between S3 data and Tableau; leveraged Athena to perform data analysis on top of S3
- Shared the responsibility of managing, monitoring and creating tasks for Attunity Replicate.
- Wrote SQL scripts to enable replication for tables.
- Supported and resolved failed jobs in production
- Created Splunk alerts and analyzed AWS cluster logs for cluster related issues.
- Adopted behavior-driven development (BDD) and wrote Cucumber-Java test scripts
- Leveraged AWS CloudFormation, Lambda, Bamboo, and Bitbucket to maintain infrastructure as code; made changes to EMR clusters, both long-running and ephemeral
- Leveraged close integration of Jira, Bitbucket and Bamboo to ensure continuous integration and delivery
- Involved in Business Requirements clarification in coordination with Business Analysts
- Used Scrum Agile methodology for iterative development
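The Sqoop/Oozie import automation described above could be scripted along these lines. This is a minimal illustrative sketch, not the actual implementation: the connection string, table, bucket, and user names are all hypothetical placeholders.

```python
import shlex

def build_sqoop_import(jdbc_url, table, target_dir, username, num_mappers=4):
    """Assemble a `sqoop import` command for moving an Oracle table into HDFS/S3.

    All connection details below are illustrative placeholders, not real endpoints.
    """
    args = [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--username", username,
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(num_mappers),
        "--as-avrodatafile",   # Avro keeps the schema alongside the data
    ]
    return " ".join(shlex.quote(a) for a in args)

cmd = build_sqoop_import(
    jdbc_url="jdbc:oracle:thin:@//db.example.com:1521/ORCL",  # hypothetical
    table="MARKET_TICKS",                                      # hypothetical
    target_dir="s3://example-bucket/raw/market_ticks",         # hypothetical
    username="etl_user",                                       # hypothetical
)
print(cmd)
```

In practice a command like this would typically be embedded in an Oozie Sqoop action and scheduled by a coordinator rather than run directly.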
Confidential
Hadoop Developer
Responsibilities:
- Ingested data from Teradata into the big data platform using the Talend framework
- Automated the ETL process to move data from the landing zone to Confidential by integrating Oozie workflows
- Transformed, cleansed, and loaded data across the layers of the EAP using HiveQL, based on business requirements
- Performed data quality checks on raw data using a custom DQ framework and data-quality Hive queries
- Generated alerts based on the scenario logic specified by the business
- Automated workflows using Oozie
- Handled version control, build creation, and promotion for release management using Rational Team Concert
- Created run books for multiple applications developed across Enterprise Analytics Platform as part of global Transaction Monitoring program
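The custom DQ framework mentioned above is not shown in the resume; purely as an illustration, a rule-based check of the kind described might look like this, with the field names and rules invented for the example.

```python
def run_dq_checks(rows, checks):
    """Apply named data-quality predicates to each row; collect (row index, rule) failures."""
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in checks.items():
            if not predicate(row):
                failures.append((i, name))
    return failures

# Hypothetical raw records and rules, for illustration only.
raw = [
    {"account_id": "A1", "amount": 120.0},
    {"account_id": None, "amount": 75.5},   # fails the null check
    {"account_id": "A3", "amount": -10.0},  # fails the range check
]
checks = {
    "account_id_not_null": lambda r: r["account_id"] is not None,
    "amount_non_negative": lambda r: r["amount"] >= 0,
}
print(run_dq_checks(raw, checks))  # [(1, 'account_id_not_null'), (2, 'amount_non_negative')]
```

The same predicates can equally be expressed as Hive queries (e.g. counting rows where a column is NULL), which is what the data-quality Hive queries above would do at scale.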
Confidential
Software Developer
Responsibilities:
- Developed and enhanced the Royalty Management System web application, based on the Spring MVC framework, using log4j for error logging
- Created and enhanced interactive web pages using HTML, CSS, AngularJS, and jQuery; created AngularJS controllers and ExpressJS routes.
- Called RESTful web services through AngularJS services to retrieve JSON objects and reshaped the response objects for display in the UI.
- Used the JUnit and EasyMock frameworks for a test-driven development approach, Git for source control, and Maven for automated builds.
- Involved in all phases of SDLC such as requirements gathering, modeling, analysis, design and development.
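The response-reshaping step (REST JSON in, flat UI view model out) is language-agnostic; here is an illustrative Python sketch, where the payload shape and field names are assumed rather than taken from the actual service.

```python
def to_view_model(response):
    """Flatten a nested service payload into the flat fields a UI grid expects.

    The 'items'/'work'/'royalty' structure is a hypothetical example payload.
    """
    return [
        {
            "title": item["work"]["title"],
            "royalty": f"${item['royalty']['amount']:.2f}",  # format for display
        }
        for item in response["items"]
    ]

payload = {"items": [{"work": {"title": "Song A"}, "royalty": {"amount": 12.5}}]}
print(to_view_model(payload))  # [{'title': 'Song A', 'royalty': '$12.50'}]
```

In the AngularJS application described above, the equivalent transformation would live in an Angular service before the data is bound to the view.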
Confidential
Software Developer
Responsibilities:
- Designed and developed various modules of the application using the Struts MVC architecture with servlets, JSP, JDBC, HTML, CSS, and JavaScript
- Created database tables and stored procedures to fetch and store data in an Oracle database
- Performed unit and functional testing of application before release of code to client
- Involved in design, SQL review, and table design review sessions with application designers
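As a rough illustration of the parameterized fetch logic that a stored procedure provided here: the actual database was Oracle, but this sketch substitutes Python's built-in sqlite3 for self-containment, and the schema is invented.

```python
import sqlite3

# In-memory stand-in for the real schema; sqlite3 is used here only for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.execute("INSERT INTO orders (customer, total) VALUES (?, ?)", ("ACME", 250.0))
conn.commit()

def fetch_orders_by_customer(conn, customer):
    """Parameterized lookup, playing the role a fetch stored procedure served in the app."""
    cur = conn.execute(
        "SELECT id, customer, total FROM orders WHERE customer = ?", (customer,)
    )
    return cur.fetchall()

print(fetch_orders_by_customer(conn, "ACME"))  # [(1, 'ACME', 250.0)]
```

A real Oracle stored procedure would encapsulate the same query server-side and be invoked over JDBC with a CallableStatement.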
Confidential
Software Developer
Responsibilities:
- Trained in .NET for application development and was provided access to the application for analysis