
PL/SQL, Hadoop, Java Developer Resume


San Jose, CA

PROFESSIONAL SUMMARY:

  • Extensive experience in Oracle PL/SQL and MongoDB development and maintenance: designing and developing back-end packages, triggers, views, materialized views, procedures, functions, cursors, collections, PL/SQL objects, and DBMS Scheduler jobs. Worked more than 2.5 years with MongoDB, with expertise in writing complex MongoDB queries, designing collections, building aggregation-framework queries, and writing data update/migration scripts; good knowledge of related technologies including Informatica, Java, Servlets, JSP, XML Schema, JSON, UNIX, GoldenGate, Tidal jobs, HTML, and Agile methodology.
  • Good hands-on experience with the Hadoop Big Data platform: Hive, Hue, Sqoop, UNIX shell scripting, and Informatica
  • Writing MapReduce jobs and Java UDFs for Hadoop development
  • Expertise in development, integration, installation, implementation, maintenance, testing, and debugging of various applications in Oracle PL/SQL
  • Good knowledge of RDBMS concepts
  • Experience in a migration project from Oracle to MongoDB
  • Expertise in writing complex MongoDB queries using the aggregation framework and MapReduce
  • Expertise in Scrum/Agile software development methodologies using Rally
  • Around 5 years of US experience with client interaction on a day-to-day basis
  • Experience leading the team for the Oracle-to-MongoDB migration and resolving problems
  • Built a JSON-compare project in Java that connects to two MongoDB instances, compares a collection, computes the difference, and prepares MongoDB delta scripts for deployment
  • Experience as SME for an ordering e-business tool
  • Experience with GoldenGate technology, setting up GoldenGate replication between multiple databases
  • Experience with Oracle XML DB, setting up XML notifications
  • Experienced in providing IT delivery services
  • Expertise in various cross-functional analyses and their impact on the business
  • Expertise in gathering and analyzing the requirements of users and translating them into technical definitions and designs
  • High customer interaction on a day-to-day basis
  • Experience in handling the Remedy Management System used for tracking incident and problem tickets
  • Closely involved in various fiscal period closure planning and system readiness
  • Experience in supporting SOX controls and various internal and external audits
  • Hands-on expertise with Informatica PowerCenter and Business Objects XI
  • Involved in various release management activities, from script creation through the deployment cycle
  • Experience debugging and implementing performance improvements: adding optimizer hints, working with the performance team, and pulling and analyzing a trace/TKPROF of the session
  • Experienced working in Agile and Waterfall environments
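The JSON-compare project mentioned above was written in Java against two live MongoDB instances; as a minimal, self-contained sketch of the idea, the Python below compares two in-memory snapshots of a collection keyed by `_id` and emits MongoDB shell statements for the delta (the collection name and sample documents are illustrative only).

```python
# Sketch of the collection-diff approach: compare two snapshots of a
# collection (dicts keyed by _id) and emit mongo shell statements that
# would bring the target in line with the source. Plain dicts stand in
# for the two live database connections of the original Java tool.
import json

def diff_collections(source, target, collection="orders"):
    """Return MongoDB shell statements for the source-vs-target delta."""
    scripts = []
    for _id, doc in source.items():
        if _id not in target:
            scripts.append(f"db.{collection}.insertOne({json.dumps(doc)})")
        elif doc != target[_id]:
            scripts.append(
                f"db.{collection}.replaceOne({json.dumps({'_id': _id})}, {json.dumps(doc)})"
            )
    for _id in target:
        if _id not in source:
            scripts.append(f"db.{collection}.deleteOne({json.dumps({'_id': _id})})")
    return scripts

# Hypothetical sample data: doc 1 changed, doc 2 missing in target, doc 3 extra.
src = {"1": {"_id": "1", "status": "BOOKED"}, "2": {"_id": "2", "status": "SHIPPED"}}
dst = {"1": {"_id": "1", "status": "HELD"}, "3": {"_id": "3", "status": "NEW"}}
for stmt in diff_collections(src, dst):
    print(stmt)
```

Comparing whole documents with `!=` is the simplest policy; a production tool would likely normalize field order and ignore volatile fields (timestamps, audit columns) before diffing.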

TECHNICAL SKILLS:

Technologies: MapR Hadoop, Hive, Shell Scripting, UNIX, Sqoop, SQL, Oracle PL/SQL, MongoDB, Informatica, Java, Servlets, JSP, HTML, Business Objects Reporting, XML, JSON, GoldenGate, GIT.

Databases: Oracle 9i, Oracle 10g, Oracle 11g, Oracle 12c, MongoDB 3.4, HDFS, Hadoop

Development Tools: Hue, TOAD, PL/SQL Developer, Robomongo, MongoChef, MongoDB Compass, Firebug debugging, Eclipse IDE, SVN version control, Notepad++, EditPlus, AppDB, Kintana, Quality Management, Rally.

PROFESSIONAL EXPERIENCE:

Confidential, San Jose, CA

PL/SQL, Hadoop, Java Developer

Responsibilities:

  • Designing and developing a Hadoop system for Confidential's Partner Metrics Central application using Hive, Hue, MapReduce, Java UDFs, Sqoop, shell scripts, Tidal, Oracle PL/SQL, and Informatica
  • Estimating cost and effort for new business requirements captured in the Confidential Partner Metrics Central application's Rally backlog
  • Conducting grooming meetings with business analysts and end-users to understand PMC business requirements
  • Performing impact analysis for new change requests and gap analysis of the As-Is and To-Be systems described in the Rally backlog
  • Developing ETL (Extract, Transform, Load) jobs in Informatica to pull data from the EDW Teradata data warehouse and POS systems in Oracle
  • Developing Sqoop jobs to load data from Oracle into the Hadoop data lake and vice versa
  • Writing Hive queries that run MapReduce jobs to fetch the required data from the Hadoop data lake (Big Data system) and implement business rules
  • Creating and enhancing Java UDFs and MapReduce jobs to process the data lake's Big Data on Hadoop clusters per business criteria
  • Creating shell scripts that implement the business flow for Hive queries and Java UDFs, and running automated jobs that execute the scripts on distributed Hadoop clusters
  • Creating Tidal jobs to automate the weekly Confidential Partner Metrics calculations
  • Maintaining project status in Rally: user stories (defining requirements), tasks, defects, daily stand-up calls, sprint planning, and retrospective meetings
  • Collaborating with cross-functional teams, architects, and project managers to identify and fill gaps in EDW (Enterprise Data Warehouse) boundary meetings
  • Coordinating with stakeholders, the onsite/offshore team, and cross-functional teams for user acceptance, business acceptance, and unit testing, and handling production issues over WebEx conference calls and in-person meetings
  • Performing root cause analysis of existing production issues and providing long-term fixes tracked in Rally defects or production tickets
  • Managing releases and ensuring timely code drops through Confidential's internal release management tool, AppDB
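The weekly partner-metrics work described above follows a standard aggregate-over-facts pattern (group bookings by partner and fiscal week, then sum). As a self-contained illustration, the sketch below uses SQLite in place of Hive; the table and column names are hypothetical, not the application's actual schema.

```python
# Illustration of the rollup pattern the Hive queries implement:
# aggregate POS booking amounts per partner per fiscal week.
# SQLite stands in for Hive so the example runs anywhere; in Hive the
# same GROUP BY would compile down to MapReduce jobs on the data lake.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE pos_bookings (partner_id TEXT, fiscal_week TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO pos_bookings VALUES (?, ?, ?)",
    [
        ("P100", "FY24W01", 1500.0),  # hypothetical sample rows
        ("P100", "FY24W01", 500.0),
        ("P200", "FY24W01", 300.0),
    ],
)
rows = conn.execute(
    """SELECT partner_id, fiscal_week, SUM(amount) AS total_bookings
       FROM pos_bookings
       GROUP BY partner_id, fiscal_week
       ORDER BY partner_id"""
).fetchall()
print(rows)  # [('P100', 'FY24W01', 2000.0), ('P200', 'FY24W01', 300.0)]
```

In the real pipeline this result would feed the Tidal-scheduled weekly metrics job rather than being printed.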

Tools: MapR Hadoop, Hive, Hue, Sqoop, Java, Eclipse, Git, UNIX, shell scripting, Oracle PL/SQL, Toad 6.5, SSH client, Informatica, Tidal Job, AppDB

Confidential, San Jose, CA

Sr. PL/SQL, MongoDB Developer

Responsibilities:

  • Designing and developing the MongoDB system for Order
  • Understanding business requirements functionally for new developments
  • Creating technical documentation based on the functional requirements
  • Analyzing new requirements and providing effort estimates for the development
  • Developing code per the functional requirements
  • Ensuring timely code drops
  • Working with quality analysts to fix bugs in the system
  • Working with the GoldenGate team to set up GoldenGate for particular requirements
  • Working with the BEMs team to get the notification setup done
  • Worked on a POC and implementation of XML notification generation
  • Coordinating with cross-functional teams on development impacts
  • Release management
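An Oracle-to-MongoDB migration of the kind described here typically folds normalized relational rows (an order header plus its line rows) into one nested document. The sketch below shows that mapping in Python; all field names are illustrative, not the application's actual schema.

```python
# Sketch of the row-to-document mapping used when migrating relational
# order data into MongoDB: one header row plus its line rows become a
# single nested document keyed by the order id. Field names are
# hypothetical stand-ins for the real schema.
def to_order_document(header, lines):
    """Fold a header row and its line rows into one MongoDB-style document."""
    doc = {"_id": header["order_id"], "status": header["status"], "lines": []}
    for line in lines:
        doc["lines"].append({"sku": line["sku"], "qty": line["qty"]})
    return doc

# Hypothetical rows as they might come back from Oracle cursors.
header = {"order_id": "SO-1", "status": "BOOKED"}
lines = [
    {"order_id": "SO-1", "sku": "A", "qty": 2},
    {"order_id": "SO-1", "sku": "B", "qty": 1},
]
doc = to_order_document(header, lines)
print(doc)
```

Embedding the lines inside the order document trades join-free reads for larger documents, which is the usual design choice when the lines are always fetched with their order.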

Environment: Oracle PL/SQL, MongoDB, Scheduler jobs, Oracle 11g, Toad 6.5, SSH client, HP Kintana 10, PVCS, CVS, UNIX, XML DB, GoldenGate, Robomongo, MongoChef, Tidal Job.

Confidential, San Jose, CA

IT Enhancement Engineer / Subject Matter Expert (SME)

Responsibilities:

  • Delivering IT services and product support for the Performance Metrics Central tool
  • Worked on enhancements for new requirements and bug fixes
  • Worked on query optimization and performance tuning
  • Worked on data and system issues pertaining to data quality, availability, and upstream refresh jobs
  • Generating and validating weekly metrics files before publication to end users
  • Investigating and correcting metrics anomalies
  • Setting up new program and performance metrics
  • Supporting data setup for business rule creation
  • Communicating to the business on data, metrics, and system availability
  • Validating data and providing functional clarifications to the end-user community
  • Enhancing the system for case reduction and long-term solutions
  • Interaction with business operations on a regular basis
  • Worked with performance team to improve the system performance and system stability
  • Worked on $U (Dollar Universe) jobs to set correct dependencies and scheduling
  • Worked on ad-hoc reports required by business users for report validation and planning
  • Monitoring jobs at runtime and fixing job-related issues by working with the DBA and performance teams
  • Validating the system after infrastructure CRs
  • Restarting JVMs and verifying logs in case of application instability

Environment: SQL, UNIX, SSH script, Teradata (hands-on), Informatica PowerCenter 9.1, Oracle 11g, Toad 6.5, SAP Business Objects XI R3, SSH client, Dollar Universe, HP Kintana 10, PVCS

Confidential, San Jose, CA

IT Module lead

Responsibilities:

  • Delivering IT services and operational support for the Confidential partner program platform group of tools
  • Supporting the production system to handle reported issues, with responsibility for routine maintenance
  • Worked on data and system issues pertaining to data refresh and upstream issues
  • Troubleshooting various partner issues pertaining to enrollments in different programs, rebates, and Confidential revenue-impacting issues
  • Handling new program onboarding requests and feasibility analysis
  • Migrating various third-party programs into the Partner Program Enrollment tool
  • Analyzing new requirements and writing business rules to support them
  • Worked on creating baseline reports to support business data validations
  • Running program payment runs and supporting partner payment queries
  • Validating program rebate calculations and tie-outs
  • Communicating to business operations and impacted downstream teams in case of any anomaly in the system
  • Worked closely with program managers and finance stakeholders to correctly identify their requirements and ensure all requirements were met on time
  • Ensuring release readiness and post production issues handling
  • Supporting SOX and PwC audits
  • Developed help docs for PPE L2 support covering frequently asked business cases
  • Validating the system after infrastructure CRs
  • Restarting JVMs and verifying logs in case of application instability

Environment: SQL, PL/SQL, Business Objects, Informatica, Oracle 10g, Toad 6.5, MS Office, UNIX, SAP Business Objects, SSH client, Dollar Universe, HP Kintana 10, PVCS

Confidential

IT Support Engineer

Responsibilities:

  • Conducted meetings with management and the user community to understand their requirements and convert business requirements into technical specifications
  • Gathered requirements from users and developed functionality per the business requirements
  • Involved in creating new reports for business users and cross-functional teams
  • Working with the partner relationship team on process optimization and case reduction
  • Developed help docs for L2 support covering frequently asked business cases
  • Worked with cross-functional teams to understand discrepancies between the PDB and CR databases, and worked on planning and fixing those discrepancies
  • Worked on data quality between PDB and CR
  • Worked on day-to-day issues related to partner registration, attributes, and contact management
  • Worked on PDB hierarchy changes
  • Validating the system after infrastructure CRs
  • Restarting JVMs and verifying logs in case of application instability

Environment: SQL, PL/SQL, Business Objects, TIBCO, Oracle 10g, Toad 6.5, MS Office, UNIX, SAP Business Objects, SSH client, Dollar Universe, HP Kintana 10, PVCS

Confidential

IT Support Engineer

Responsibilities:

  • Posting batches in the Oracle Forms application
  • Generating the daily posting report by refreshing the BO universe
  • Sending reports to the different theater admins and, in case of anomalies, working with WIPS DCA support engineers to fix the issue
  • Coordinating with stakeholders and the WIPS DCA business team on report timelines and any P1/P2 issues in the upstream system

Environment: SQL, Business Object

Confidential

Trainee

Responsibilities:

  • Learning new technologies and gaining hands-on experience with them
  • Passed the midterm and end-term exams, exceeding the 70% passing mark
  • Learned TCS-specific standards and the code of conduct
  • Delivered a project at the end of the term
  • Inputs for finding the shortest path between destinations were provided through a Java interface, and an Oracle DB stored the destination data set
  • The shortest-path report and its summary were generated in UNIX

Environment: Oracle, SQL, PLSQL, JAVA, UNIX
