
Hadoop/Cloud Engineer Resume


Chandler, AZ


  • Vast experience in the IT industry in architecting, analysis, design, development, testing, integration, administration, and support of diverse application areas using various technologies.
  • Extensively worked with various forms of data/datasets and databases, involving transformations, loads, reporting, analysis, and SQL.
  • A strong coder: extensive work across web, scripting, desktop, database, and distributed-computing programming languages.
  • Experience installing, configuring, and troubleshooting Hadoop ecosystem components such as MapReduce, HDFS, Hive, Sqoop, Flume, and ZooKeeper.
  • Experience in Python scripting and building REST APIs with Python Flask.
  • Experience with the Spark distributed computing framework using Scala and PySpark.
  • Good understanding of Hive and of NoSQL databases such as HBase, Cassandra, and MongoDB.
  • Good understanding of ETL, data warehouse, and data lake concepts.
  • Expertise in analyzing data using Spark and HiveQL.
  • Experience importing and exporting data between relational database systems (RDBMS) and HDFS using Sqoop.
  • Excellent understanding of various data formats like XML, JSON, CSV, Avro, Parquet.
  • Expertise in writing complex SQL queries, Query performance optimization, Stored Procedures/functions, Data loads and data quality.
  • Expertise in developing intranet/internet applications using Java/J2EE technologies (JSP, Servlets, JDBC, JNDI, etc.) and other web languages such as PHP, HTML/DHTML, CSS, XML, and JavaScript/jQuery.
  • Experience in writing and executing test scripts for system, integration, and UAT testing.
  • Excellent technical, analytical and quick-troubleshooting skills.
  • Excellent modeling & presentation skills.
  • Hard-working, productive, and dependable team member.
  • Reputed as a quick learner and implementer of new tools.
  • Enthusiastic to explore more into big data & cloud technologies
  • Avid user of Linux desktops and distributions such as Ubuntu, Linux Mint, and CentOS.
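As an illustration of the delimited-data and JSON handling claimed above, a minimal standard-library sketch (field names and sample values are hypothetical, not from any actual project):

```python
import csv
import io
import json

def csv_to_json_records(csv_text):
    """Parse delimited text into a list of dicts, ready for json.dumps."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [dict(row) for row in reader]

# Hypothetical parts feed: two delimited records with a header row.
raw = "part_id,qty\nP100,4\nP200,7\n"
records = csv_to_json_records(raw)
print(json.dumps(records))
```

The same dict-per-record shape drops straight into a Spark DataFrame or a REST response body.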


Big Data/Data Science Technologies: Hadoop 1/2, HDFS, Spark (Scala/PySpark), Hive, Sqoop, Flume, Spark Streaming, MapReduce, Kafka, ZooKeeper, Pig, Drill

Cloud Tools: AWS EC2, RDS, S3, SQS, SES, ELB (knowledge of Lambda, EMR)

IDE/GUI dev Tools: PyCharm, Anaconda, IntelliJ, Eclipse, MS Visual Studio

Languages (General, Database & Web): Python, Scala, Java, PHP, JavaScript/jQuery, SQL, PL/SQL, Confidential-SQL, HTML/DHTML/CSS, VBScript, XML, VBA

Siebel: Siebel 8.x/7.x, Configuration/Integration, Web Services

Databases: Oracle 9i/10g/11g, MySQL, SQL Server, Hive (knowledge of NoSQL databases such as HBase and Cassandra)

Database Tools: SQL Query Analyzer, Data Import/Export, SQL Profiler, Toad, MySQL Workbench, SQL Developer

Reporting Tools: Crystal Reports, SQL Server Reporting Services (SSRS), MS Excel

Testing Tools: Postman, SoapUI, HP-QC, Bugzilla, other custom testing applications

Distributed Technologies & Integration Tools: Python Flask-RESTful API, Java REST services, Web Services, Java Servlets, JSP, JDBC, Siebel Integrations (EAI, Web Services, HTTP Transports, MQ adapters, VBC), SoapUI, SSIS, Jenkins

Servers/OS: Tomcat, IIS, BEA WebLogic Server, Windows Server, Unix/Linux (CentOS, Ubuntu, Linux Mint, etc.)

ETL Tools: SQL Server Integration Services (SSIS)

Build Tools: Maven, SBT, Ant

Misc. Tools: Git, GitHub, SVN, PuTTY, MS Visio, MS Project, Ambari

SDLC Methodologies: Waterfall, Agile


Confidential, Chandler, AZ

Hadoop/Cloud Engineer


  • Analysis & design of the data imports into Loyalty RDBMS system & Hive.
  • Worked on many POCs for various types of data massaging/processing.
  • Design & development of simple RESTful services for real-time parts data into MySQL database using Python Flask-RESTful framework.
  • Reading Hive data using Spark, applying transformations & generating meaningful data needed for reporting/analytics.
  • Building the AWS RDS MySQL instances and migrating the data into them.
  • Working on importing delimited data into the CRM system through various processes.
  • Creating EC2 instances, auto-scaling groups, load balancers, security groups, etc.
  • Creating and managing data in S3 Standard storage and Glacier.
  • Worked on deployment and continuous integration using Jenkins.
  • Working with JSON, SOAP, and XML data.
  • Writing various SQL and HiveQL queries to analyze and validate the data.
  • Testing the REST API using Postman and SoapUI.
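The real-time parts service described above can be sketched in a few lines; this minimal version uses plain Flask with an in-memory dict standing in for the MySQL table (route and field names are hypothetical, and Flask-RESTful wraps the same pattern in resource classes):

```python
# Minimal sketch of a parts REST endpoint; PARTS is a stand-in for MySQL.
from flask import Flask, jsonify, request

app = Flask(__name__)
PARTS = {}  # part_id -> record dict (hypothetical storage)

@app.route("/parts/<part_id>", methods=["GET"])
def get_part(part_id):
    part = PARTS.get(part_id)
    if part is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(part)

@app.route("/parts/<part_id>", methods=["PUT"])
def upsert_part(part_id):
    # Real service would INSERT/UPDATE a MySQL row here.
    PARTS[part_id] = request.get_json(force=True)
    return jsonify({"status": "ok"}), 200
```

Postman or SoapUI can then exercise GET/PUT against `/parts/<id>`, which matches the testing bullet above.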

Environment: Python Flask-RESTful API, Python 2.7/3.5, Spark (PySpark), AWS, Java, Anaconda/Spyder, Eclipse IDE, venv, MySQL, Oracle, Postman, Jenkins, Siebel 8.1 Loyalty application, Siebel Tools/Client, SQL, SQL Developer/Toad, Unix/Linux/Windows, PuTTY etc.

Confidential, Seattle, WA

Hadoop Engineer


  • Working on analysis, design & development of integrations between CRM & middleware application layer.
  • Loading the Loyalty application data into HDFS (Hive) through Kafka pipelines using Spark Streaming.
  • Created Hive internal and external tables with appropriate static and dynamic partitions for query efficiency.
  • Analyzed the data by performing Hive queries and Spark transformations.
  • Reading delimited log file data using Apache Spark and analyzing application exceptions/errors.
  • Loading the data from the data warehouse into Hive partitions.
  • Reading Hive data using Spark and generating meaningful data/reports needed for analytics.
  • Extracting datasets into Spark RDDs/DataFrames to perform various operations and reporting.
  • Built a custom Java Flume sink to import delimited coupon data into a staging table.
  • Working with the Siebel Data Model for data imports.
  • Developed the technical design for integrating Loyalty accrual and redemption transactions, Loyalty members, cards, and coupons.
  • Troubleshooting and analyzing the data using SQL scripts.
  • Creating various test cases and executing them using SoapUI.
  • Writing various SQL queries to analyze and validate the data.
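The delimited-log analysis described above boils down to a per-line parse plus an aggregation; a cluster-free sketch of that core logic is below (the pipe-delimited layout and sample lines are hypothetical; in Spark the same `parse_log_line` would be mapped over an RDD of lines):

```python
from collections import Counter

def parse_log_line(line, sep="|"):
    """Split a delimited log line into (timestamp, level, message)."""
    ts, level, message = line.strip().split(sep, 2)
    return ts, level, message

def count_levels(lines):
    """Tally log levels; in Spark this is a map over the lines plus countByKey."""
    return Counter(parse_log_line(l)[1] for l in lines if l.strip())

sample = [
    "2018-03-01T10:00:00|INFO|job started",
    "2018-03-01T10:00:05|ERROR|NullPointerException in coupon sink",
    "2018-03-01T10:00:09|ERROR|timeout loading partition",
]
print(count_levels(sample))  # Counter({'ERROR': 2, 'INFO': 1})
```

Grouping the ERROR messages themselves (rather than just counting) is the same pattern with the message as the key.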

Environment: Apache Spark, Spark Streaming, Kafka, Flume, Java, IntelliJ, Scala, Siebel CRM, Oracle 11g, SoapUI, SQL, PL/SQL, SQL Developer/Toad, Subversion, SharePoint, HP-ALM/QC, Unix, PuTTY etc.

Confidential, Minneapolis, MN

Hadoop/Data Engineer


  • Worked on the analysis and design of the integration between Siebel and web applications using shell scripts, Web Services, Spark, and SQL scripts.
  • Created Hive tables to load the data.
  • Used Apache Spark RDDs and DataFrames with PySpark to transform and analyze the delivery schedule data stored in the Hadoop cluster.
  • Writing various HiveQL queries to analyze and validate the data.
  • Wrote shell scripts and SQL/PL-SQL scripts to generate CSV files of the daily scheduled Service Requests for the Cisco auto-dialer.
  • Worked with cron jobs to generate SR/Activity data.
  • Worked with the Siebel data model.
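The daily CSV extract for the auto-dialer described above can be sketched as a filter-and-write over scheduled Service Requests (the column layout and sample records are illustrative, not the actual dialer format):

```python
import csv
import io
from datetime import date

def daily_sr_csv(service_requests, for_date):
    """Render the day's scheduled Service Requests as CSV text.

    Column names are hypothetical stand-ins for the dialer layout."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["sr_number", "phone", "scheduled_date"])
    for sr in service_requests:
        if sr["scheduled_date"] == for_date:
            writer.writerow([sr["sr_number"], sr["phone"],
                             sr["scheduled_date"].isoformat()])
    return buf.getvalue()

today = date(2017, 5, 1)
srs = [
    {"sr_number": "SR-1", "phone": "555-0100", "scheduled_date": today},
    {"sr_number": "SR-2", "phone": "555-0101", "scheduled_date": date(2017, 5, 2)},
]
print(daily_sr_csv(srs, today))
```

A cron entry would invoke this script nightly and drop the file where the dialer picks it up.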

Environment: Apache Spark, Python/Scala, Siebel 8.1 Call Center Tools/Client, Oracle 11g, SQL, PL/SQL, Shell Scripts, SQL Developer, Subversion, SharePoint, HP-ALM/QC, Unix, PuTTY etc.

Confidential, Sunnyvale, CA

Sr. Technical Consultant


  • Designed and developed Java RESTful services for integration with Siebel CRM.
  • Built a Java tool to generate and import the Product Eligibility/Compatibility rules into Siebel.
  • Migration/import and export of MySQL data.
  • Working on migration of Order Management entities (Quotes, Orders, Fulfillment, Work Orders, Assets/MACD).
  • Validating imported legacy Assets using SQL scripts.
  • Writing SQL queries to analyze and validate the data.
  • Working with the Siebel Data Model and optimizing Siebel SQL queries by analyzing Oracle execution plans and creating/optimizing database indexes.

Environment: Eclipse Neon, Java RESTful API, Tomcat 8, MySQL, Oracle 11g, SQL Server, SQL, Confidential-SQL, SQL Developer/Toad, HP-ALM/QC, BI Publisher, Subversion, SharePoint etc.

Confidential, Portland, OR

Data Engineer


  • Created Java RESTful services needed for portal integration for member enrollment, provider query, household query, etc.
  • Working with Siebel data model for integration.
  • Working with Data Maps for data transformation.
  • Validating the data for Health Provider/Network/Plans.
  • Writing SQL scripts to analyze and validate the data.
  • Worked with configuring Siebel objects as per Oracle Expert Services review comments.
  • Exposure to Siebel Open UI.

Environment: Eclipse IDE, Java REST API, Tomcat 7, Siebel Public Sector, Oracle 11g, SQL, SoapUI, SQL Developer, Maven, XML, JIRA etc.

Confidential, Portland, ME

Siebel CRM Integration Architect


  • Analysis, design, and development of Siebel-to-B2B integration.
  • Working with Siebel real-time integration using HTTP transports, Web Services, runtime events, and workflow policies.
  • Developed various workflow processes that are invoked by runtime events.
  • Created a runtime-event (RTE) service layer to capture runtime events as jobs.
  • Worked with the Server Requests business service to submit workflow request jobs to a specialized component.
  • Worked with Siebel Integration Objects for integration between Siebel and web applications.
  • Worked with various Siebel component jobs/RCRs.
  • Worked with HTTP transports to send IO/XML messages to a web application.
  • Siebel application configuration involving various Siebel objects using Configuration, eScripting, and server component administration.
  • Working with various SQL scripts to verify data.
  • Working with SSIS packages for migrating data.
  • Working with Communications Outbound Manager and communication profiles to send emails.
  • Server administration tasks such as full compilation, SRF replacement, browser scripts, component administration, and server shutdown/restart.
  • Working on the Siebel Data Model and analyzing the SQL.

Environment: Siebel 8.1 Call Center, Oracle 11g, SQL, SQL Server 2010, SSIS, HP ALM/QC, SoapUI, PuTTY, Unix, Toad, XML, Subversion etc.

Confidential, Santa Clara, CA

Siebel Integration Consultant


  • Analysis, development, and testing of the Siebel-to-Mobility integration.
  • Developed Siebel inbound Web Services to enable the Mobile Integration Platform to consume and display data on mobile devices via the mobile app.
  • Worked with Integration Objects, their components, and the keys used for data exchanged between Siebel and the mobile app.
  • Worked with Data Mapping to convert data from one IO format to another.
  • Developed custom workflows to enable the mobile app to choose appropriate statuses by integrating with the Siebel State Model for Service Requests and Activities.
  • Worked on Web Service WSDLs to generate SOAP requests.
  • Worked with SSO- and AD-based integration using session tokens across SOAP requests.
  • Siebel application configuration involving various Siebel objects using Configuration, scripting (eScript), and server component administration.
  • Writing the high-level design documents for application integration.
  • Testing SOAP requests using SoapUI.
  • Working with SQL scripts using joins defined in business components.

Environment: Siebel 8.1 HTIM, Oracle 11g, SQL, SoapUI, MS Visio, MS Project, HP Quality Center, XML etc.

Confidential, Culver City, CA

IT Specialist/Analyst


  • Analysis of the new development changes needed for the existing Siebel application by interacting with business users and proposing new solutions.
  • Data imports using SQL loads and SSIS packages.
  • Created Confidential-SQL stored procedures to back the reports.
  • Developed many reports using SQL queries, stored procedures, Crystal Reports, and SSRS.
  • Writing the high-level design documents for application integration.
  • Testing various internal applications.
  • Worked on the development and support of a PHP/MySQL application.
  • Exposure to the CapsPay payroll application using the Google SDK for Java.

Environment: SQL Server, SQL/Confidential-SQL, Crystal Reports, SSRS, SSIS, PuTTY, Unix, Eclipse, PHP, MySQL, MS Visio, XML etc.

Confidential, Jacksonville, FL

Siebel Analyst/Developer


  • Involved in analysis/design of product visibility using Catalog Categories & Access Groups.
  • Writing test scripts for testing specific business requirements.
  • Close interaction with the business/end users to understand the business requirements and the test scripts to be created.
  • Developing test scenarios in Mercury Quality Center and interacting with various teams for integration testing.

Environment: Siebel Finance, Siebel 7.5 Client, Oracle 9i, HP ALM/QC, XML, SQL, PL/SQL, MS-Visio, Unix

Confidential, Milwaukee, WI

Siebel Techno-Functional Consultant


  • Analysis of business requirements related to Order Management objects (Opportunities, Revenues, Quotes, Activities).
  • Development of new Siebel application requirements using application configuration, eScript, Workflows, Assignment Manager, and Data Maps.
  • Application configuration using data tables, business components, business objects, and pick lists.
  • Designed various workflows as per the requirements.
  • Worked on application internationalization using symbolic strings and applied them to various UI objects.
  • Worked with Data Maps for copying data from Opportunity to Quote.
  • Performed code review meetings to ensure the application conforms to Siebel-recommended standards.
  • Wrote various design documents (HLD, LLD) and reviewed the design documents.
  • Writing test scripts (OQ) for testing application navigation, the UI, and negative cases.
  • Executing test cases in HP Quality Center.
  • Provided support for the testing phases (OQ, PQ) by providing test data, analyzing and clarifying issues as necessary, and supporting users post go-live.
  • Extracting the component logs and troubleshooting various application issues.

Environment: Siebel Life Sciences, Siebel 8.0 Tools & Client, SQL Server, Windows/Unix, HP Quality Center, XML, SQL, PL/SQL, MS-Visio

Confidential, Hoffman Estates, IL

Sr. Siebel Developer


  • Involved in migration of the Siebel database from Oracle 9i to 10g.
  • Configured and set up the database drivers needed to support Siebel 8.0.
  • Supporting various queue- and HTTP-transport-based integration modules.
  • Setting up the new Development, Test, and Production environments by installing the Database, Gateway, Siebel servers, and SWSE on Windows servers.
  • Starting and shutting down the Siebel server, gateway, and web server to replace the SRF or troubleshoot the application.
  • Experience creating database extracts for developers' local database initialization.
  • Siebel server administration and user management, roles, and responsibilities.
  • Responsible for coordinating the requirements, design, development, and testing.
  • Configured Siebel objects in the Confidential business object and UI layers for enhancements required by the business.
  • Designed the table-level mapping between external tables/columns and Siebel EIM tables/columns.
  • Worked with the SQL*Loader utility to load data from flat files into Siebel temporary tables.
  • Developed database procedures and triggers to process the data before it enters the Siebel database.
  • Prepared IFB files and created EIM jobs to load the Siebel base tables.
  • Established communication between Siebel and a web application running on the Tomcat web server.
  • Analyzing the Siebel application logs and troubleshooting the issues.
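EIM runs like the ones above are driven by an IFB configuration file; a minimal sketch follows (the process name, batch number, and table names are illustrative, not from the actual project):

```
[Siebel Interface Manager]
USER NAME = SADMIN
PROCESS = Import Accounts

[Import Accounts]
TYPE = IMPORT
BATCH = 100
TABLE = EIM_ACCOUNT
ONLY BASE TABLES = S_ORG_EXT
INSERT ROWS = TRUE
UPDATE ROWS = TRUE
```

The EIM component job is then submitted with this IFB as its configuration parameter, importing the staged EIM rows into the named base table.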

Environment: Siebel Financial, Siebel 8.0 Tools & Client, Windows Server 2005, Oracle 10g, Apache Tomcat, JSP, SQL, PL/SQL


Techno-Functional Siebel Consultant


  • Implemented the business object layer by customizing/creating tables, business components, and their mappings to view the relevant data in the UI.
  • Implemented the presentation layer by customizing/creating UI objects such as list/form/MVG/association applets, views, and screens.
  • Understood the Siebel data layer and created the EIM loads needed to import data into the relevant tables.
  • Analysis and development of integration business requirements for the data coming into Siebel from various interfaces (ODM, BAL, CMPAL).
  • Configured specialized drilldowns (static and dynamic) for navigation to various views/applets as per the business needs.
  • Devised and developed the workflows for processing event-feed messages coming from the interfaces.
  • Worked on Integration Objects that represent the data passed across systems through MQ Series middleware.
  • Interacted with interface teams to bring consistency to the messages/XMLs exchanged between systems.
  • Designed the workflow that verifies the data content in the XML and inserts the data into Siebel based on that content.
  • Created a core API/business service to record/track all kinds of communications to the customer in the form of Email/SMS/Letter/Callback.
  • Created various communication templates for Email/SMS/Letter.
  • Worked with inbound and outbound Web Services to process data to and from Java/web applications.
  • Worked extensively with XMLs and their processing objects.
  • Modified the CSS/template files to adjust the layout of the application.
  • Configured the customer dashboard.
  • Worked with Component Requests and the workflows to be invoked periodically.
  • Writing various design documents (LLD and HLD).
Environment: Siebel Call Center, Siebel 7.7 and 7.5.3 Tools/Client, Oracle 8, Unix, SQL*Loader, XML, SQL, PL/SQL


Siebel Developer


  • Designed the LLD and organized review meetings.
  • Configured applet, view, and screen objects in the UI.
  • Administered data visibility by creating users, positions, and their responsibilities.
  • Configured drilldowns (static and dynamic) for navigation to various views/applets as per the business needs.
  • Configured pick lists (static and dynamic) for getting data from LOVs or other business data.
  • Developed control files for Assets and Products data loads from CSV files to temporary tables using the SQLLDR utility.
  • Developed PL/SQL scripts to process the data in temporary tables and transfer it onto EIM tables.
  • Ran EIM jobs to migrate the data onto Siebel base tables.
  • Ran the versioning job workflow for the loaded products.
  • Developed eScript code to migrate the Assets data from Siebel to flat files.
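A SQLLDR control file for the kind of CSV-to-temporary-table load described above might look like this (the file, table, and column names are hypothetical stand-ins):

```
LOAD DATA
INFILE 'assets.csv'
APPEND
INTO TABLE STG_ASSETS
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(ASSET_NUM, PRODUCT_NAME, STATUS, INSTALL_DT DATE "YYYY-MM-DD")
```

Running `sqlldr` with this control file stages the rows, after which the PL/SQL scripts and EIM jobs above take over.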

Environment: Siebel Call Center, Siebel 7.5 Tools/Client, Unix, Windows 2000, SQL, PL/SQL
