To excel and grow in the field of Big Data by working with a growing organization that offers the opportunity to prove my skills and be rewarded, fulfilling my own goals as well as those of the organization.
- 7.5 years of experience in the design, analysis, and development of enterprise-grade applications using Java and Big Data Hadoop technologies.
- Experience in Core Java and the Hadoop ecosystem (MapReduce MR1, YARN, HDFS, Hive, Impala, Pig, HBase, Oozie, Sqoop, Flume).
- Successfully delivered multiple initiatives (implementation and development) in Big Data analytics and large-scale data processing using the Hadoop ecosystem.
- Good exposure to all Software Development Life Cycle phases (feasibility, system study, design, coding, data migration, testing, implementation, and maintenance).
- Imported and exported data between SQL Server and Hadoop using Sqoop.
- Worked extensively on Hive, Sqoop, and Java MapReduce; tuned Hive queries and Java MapReduce programs for scalability and faster execution.
- Performed integration with custom web applications and Web services.
- Experienced in handling production support for applications with a large user base and production data.
- Excellent communication, training and people management skills.
- Achievements: Received the Bravo award for best solution delivery at Confidential.
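The MapReduce tuning mentioned above often comes down to reducing shuffle volume through local (in-mapper) aggregation. A minimal, Hadoop-free Java sketch of that pattern follows; the class and method names are illustrative, not taken from any project named here:

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Sketch of in-mapper combining, a common MapReduce tuning technique:
 * aggregate counts locally before emitting, so far fewer key/value pairs
 * cross the shuffle. Hadoop-free for illustration; in a real mapper the
 * map would be flushed in cleanup().
 */
public class InMapperCombiner {

    /** Count word occurrences locally instead of emitting (word, 1) per token. */
    public static Map<String, Integer> combine(String[] records) {
        Map<String, Integer> counts = new HashMap<>();
        for (String record : records) {
            for (String token : record.toLowerCase().split("\\s+")) {
                if (!token.isEmpty()) {
                    counts.merge(token, 1, Integer::sum);
                }
            }
        }
        return counts;  // one pair per distinct key, not one per token
    }

    public static void main(String[] args) {
        Map<String, Integer> out = combine(new String[] {"error warn error", "warn info"});
        System.out.println(out.get("error") + " " + out.get("warn"));  // 2 2
    }
}
```

The same effect can be had with a Hadoop Combiner class, but in-mapper combining avoids the extra serialization step, which is why it is a frequent first tuning move.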
Big Data Technologies: Hadoop (MapReduce, Hive, Impala, Pig, Sqoop, Flume, Oozie, Hue, HDFS), Spark; NoSQL: HBase; RDBMS: MySQL
Programming Language: Java, Scala, Python
Other Tools/Utilities: Eclipse, Git, PuTTY, WinSCP
Operating Systems: Linux, Windows
Senior Big Data Consultant
- Worked extensively with the business team to understand requirements and transformed business and functional requirements into high-level design documents.
- Provided various design recommendations during the construction phase of the project.
- Created an impact analysis document to identify the reporting applications affected by the source system migration.
- Created various Sqoop import scripts to load data from SQL Server and SAP databases into Hadoop, and Sqoop export scripts to load data from Hadoop into Teradata.
- Developed MapReduce programs to parse raw data and populate staging tables.
- Migrated the ETL transformation logic from source systems to Hive.
- Worked with the data modeling team to model target tables in the Teradata database.
- Prepared various test cases to validate the data loaded into Hadoop.
- Prepared a set of Maestro hourly/daily/weekly jobs to schedule data extraction, loading, and purging.
Technologies: Hadoop 2.0 - HDFS, MapReduce, Hive, Sqoop, Oozie, SQL Assistant, Hue, Linux
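The raw-data parsing described in this role (MapReduce programs populating staging tables) typically centers on turning delimited source records into clean staging rows. A minimal Java sketch of such mapper-side parsing logic follows; the pipe-delimited field layout (id|name|amount) and all names are hypothetical, not taken from the project:

```java
/**
 * Sketch of the record-parsing logic a mapper applies before writing
 * staging rows: split a delimited raw record, validate the field count,
 * normalize values, and re-emit in the staging layout. The input layout
 * (id|name|amount) is a hypothetical example.
 */
public class StagingRecordParser {

    /** Parse one raw record; return the staging row, or null if malformed. */
    public static String parse(String rawRecord) {
        String[] fields = rawRecord.split("\\|", -1);
        if (fields.length != 3) {
            return null;                       // malformed rows are skipped (or counted)
        }
        String id = fields[0].trim();
        String name = fields[1].trim().toUpperCase();
        String amount = fields[2].trim();
        if (id.isEmpty() || amount.isEmpty()) {
            return null;                       // required fields must be present
        }
        return String.join("\t", id, name, amount);  // tab-separated staging format
    }

    public static void main(String[] args) {
        System.out.println(parse(" 101 | acme corp | 250.75 "));
    }
}
```

In an actual job, this method would be called from the mapper's map() method, with rejected records counted via Hadoop counters so data-quality issues surface in the job history.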
- Involved in the data migration from flat files to SQL Server.
- Involved in the end-to-end development of the Special Pricing Module.
- Developed a fulfillment solution (including product catalogue, call report, and pipeline) to deliver a robust, integrated, and automated flow-through provisioning capability.
- Developed a customer relationship management solution to provide systematic support for the Investment Counselors service desk processes.
- Prepared technical design documents.
- Involved in code review process.
- Interacted with users and stakeholders to understand the existing system and the pain points in the current environment.
- Involved in installation, deployment, and post-go-live support.
- Set up security profiles and all users.
- Created many views to allow users to see exactly the information they wanted about leads.
Technologies: Java, Eclipse, SQL Server 2008, XML, AJAX, HTML, CSS, Git
- Used an N-tier architecture with presentation, service, business, and data access layers, all coded in Java.
- Developed Fault Contracts with custom exception messages and fault codes.
- Involved in dynamic MSS360 data model configuration management.
- Developed the Dealer Survey ASPX page as a custom MSS360 web page.
Software: Java, Eclipse, SoapUI, Oracle 9i, SQL, UNIX, Web services
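The N-tier layering described in this role (presentation layer calling a service layer, which delegates to a data access layer) can be sketched in a few lines of Java. All interfaces and names below are illustrative, not from the actual system:

```java
import java.util.Map;

/**
 * Minimal sketch of N-tier layering: the presentation layer talks only to
 * the service layer, which delegates to a data-access interface, so each
 * layer can be swapped or unit-tested in isolation.
 */
interface CustomerDao {                       // data access layer contract
    String findNameById(int id);
}

class CustomerService {                       // service/business layer
    private final CustomerDao dao;

    CustomerService(CustomerDao dao) {
        this.dao = dao;
    }

    String greet(int id) {                    // trivial business rule for the sketch
        String name = dao.findNameById(id);
        return name == null ? "Unknown customer" : "Hello, " + name;
    }
}

public class NTierDemo {                      // stands in for the presentation layer
    public static void main(String[] args) {
        // In-memory DAO stub; production code would put JDBC behind this interface.
        CustomerDao dao = id -> Map.of(1, "Alice").get(id);
        CustomerService service = new CustomerService(dao);
        System.out.println(service.greet(1));   // Hello, Alice
        System.out.println(service.greet(2));   // Unknown customer
    }
}
```

Because the service depends only on the `CustomerDao` interface, the data layer can be replaced (in-memory stub, JDBC, web service client) without touching presentation or business code, which is the main point of the layering.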