Sr. Hadoop Developer Resume
Plano, TX
SUMMARY
- Overall 11 years of professional IT experience in implementation, support, and testing, including 6+ years implementing Siebel/OBIEE/Informatica and 4+ years with the Big Data/Hadoop ecosystem.
- Self-starter and excellent team player with strong consulting, interpersonal, writing, communication, and techno-functional skills.
- Experience with Hadoop clusters on major Hadoop distributions, CDH4 and CDH5.
- Experience with major Hadoop ecosystem components such as HDFS, MapReduce, YARN, Hive, Pig, HBase, Sqoop, Flume, Spark, and Kafka; job/workflow scheduling tools such as Oozie and NiFi; and monitoring through Cloudera Manager.
- Experience working with Spark using Scala; worked on Spark Streaming and Spark SQL.
- Hands-on experience working with NoSQL databases such as HBase.
- Created HBase and Hive tables to store large sets of structured, semi-structured, and unstructured data from various sources.
- Experience in developing Pig Latin scripts and Hive Query Language (HiveQL) for data analytics.
- Expertise in importing/exporting data between existing relational databases and HDFS using Sqoop.
- Involved in converting Hive queries into Spark transformations and actions on RDDs using Scala and Python.
- Imported data from AWS S3 and converted, transformed, and performed actions on RDDs, DataFrames, and Datasets using Scala and Python.
- Extensively used ETL methodology to support data extraction, transformation, and loading.
- Developed various dashboards in Tableau, using context filters and sets while dealing with huge volumes of data.
- Constantly monitored the software configuration/development/testing process to assure quality deliverables with minimal defects.
- Researched technical requests and other issues raised throughout all phases of client projects.
- Worked with Expert Services to review the design process and ensure best configuration standards were followed.
- Ensured Quality Assurance (QA) standards were met as part of ongoing development, in accordance with client standards and QA methodology.
- Functioned as the primary contact for technical resolution and for the business.
- Good experience troubleshooting production-level issues and identifying root causes.
- Acted as the client representative and customer-facing contact for all technical, database, and infrastructure-related issues.
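The Sqoop import/export work summarized above follows a standard command-line shape. A minimal Python sketch that assembles such a command as an argument list (the JDBC connection string, table name, and HDFS path below are placeholders, not details from any actual project):

```python
def build_sqoop_import(jdbc_url, table, target_dir, num_mappers=4):
    """Assemble a standard `sqoop import` command as an argument list.

    Uses only well-known Sqoop flags: --connect, --table,
    --target-dir, and --num-mappers.
    """
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(num_mappers),
    ]

# Placeholder connection details for illustration only.
cmd = build_sqoop_import(
    "jdbc:mysql://dbhost:3306/sales",
    "orders",
    "/user/etl/orders",
)
```

In practice the list would be handed to `subprocess.run` on an edge node with Sqoop on the PATH; building it as a list (rather than one shell string) avoids quoting problems.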
TECHNICAL SKILLS
Applications: Communications, Finance, Life Sciences, Automotive, Call Center, eSales, eService, eConsumer, Production Support
Big Data: HDFS, MapReduce, Hive, Pig, HBase, Kafka, Spark Streaming, Scala, Flume, Impala, Oozie, NiFi, Sqoop
Databases: Oracle, MySQL, HBase
Programming Languages: Python, Scala, Java Scripting
Analytics: OBIEE, Tableau
Siebel Areas: Installation, Tools Config, Scripting, Workflows, EAI, Application Deployment Manager, Asset Management, Order Management, Quote Management, Pricing
Scripting: OOP Methodology, Visual Basic, COM, XML, Siebel VB, eScript
Integration: PL/SQL, JMS, MSMQ, EBC, VBC, Web Services, IBM MQ Series, TIBCO
PROFESSIONAL EXPERIENCE
Confidential, Plano, TX
Sr. Hadoop Developer
Responsibilities:
- Participated in business requirement gathering and translating requirements into technical specifications, development and testing.
- Developed solutions to process data into HDFS (Hadoop Distributed File System), process within Hadoop and emit the summary results from Hadoop to downstream systems.
- Worked on analyzing data in Hadoop cluster using different big data analytic tools including Pig, Hive, Spark and Sqoop.
- Responsible for building scalable distributed data solutions using Hadoop.
- Involved in importing and exporting data (SQL Server, Oracle, csv and other formats) from local/external file system and RDBMS to HDFS.
- Exported the analyzed data to relational databases using Sqoop for visualization and for generating reports for the BI team, and also used Sqoop to ingest data into HDFS.
- Used Hive and Impala to query the data in HBase.
- Developed Spark Application using Scala and Python for data extraction, transformations and loading.
- Worked on Spark streaming using Scala for real time reporting and monitoring.
- Knowledge of developing customized UDFs to extend Hive and Pig Latin functionality.
- Created HBase tables to store various data formats of data coming from different portfolios.
- Used Flume extensively in gathering and moving log data files from Application Servers to a central location in Hadoop Distributed File System (HDFS).
- Responsible for managing data coming from different sources.
- Developed workflow in NiFi to automate the tasks of loading the data into HDFS and pre-processing with Pig.
- Performed text mining using Python scripts to extract exact and relevant BI information.
- Used collections in Python for manipulating and looping through different user-defined objects.
- Designed the logical and physical data model, generated DDL scripts, and wrote DML scripts for Oracle 10g database.
- Integrated Hive with Tableau using Hive JDBC driver, for auto generation of Hive queries for non-technical business user.
- Integrated multiple sources data (SQL Server, DB2, Oracle) into Hadoop cluster and analyzed data by Hive-HBase integration.
- Worked on creating a live stream of data from a traditional RDBMS using Kafka Connect so it could be consumed by Spark Streaming.
- Understood complex data structures of different types (structured, semi-structured) and de-normalized them for storage in Hadoop.
- Worked with offshore team to initiate development of application.
- Worked as a production support member resolving production issues raised by business users.
Environment: Hadoop, HDFS, Pig, Hive, MapReduce, Sqoop, Spark, Scala, Oozie, NiFi, Flume, Kafka, Impala, Python, Shell Scripting and Tableau.
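The Python text-mining step above can be illustrated with a minimal sketch using the standard-library `collections.Counter`; the sample record and term list below are hypothetical, not data from the project:

```python
import re
from collections import Counter

def extract_term_frequencies(text, keep_terms=None):
    """Tokenize free text and count term frequencies.

    If keep_terms is given, only those business-relevant terms
    are kept, mimicking extraction of relevant BI information.
    """
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    if keep_terms is not None:
        tokens = [t for t in tokens if t in keep_terms]
    return Counter(tokens)

# Hypothetical sample line; the real input was log data staged in HDFS.
sample = "Order 1001 shipped; order 1002 delayed; order 1001 delivered"
freq = extract_term_frequencies(
    sample, keep_terms={"order", "shipped", "delayed", "delivered"}
)
```

`freq.most_common()` then yields the term counts sorted by frequency, which is the shape of input a downstream Tableau dashboard or Hive table would consume.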
Confidential, Kansas City, MO
Hadoop Developer
Responsibilities:
- Involved in all phases of the project: gathering business requirements, translating business requirements into technical specifications, development, and testing.
- Prepared Technical Design Documents and Functional Design Documents.
- Installed and configured Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster.
- Used Impala to read, write and query the Hadoop data in HBase.
- Used Sqoop to extract data from Oracle server and MySQL databases to HDFS.
- Involved in loading data from UNIX file system to HDFS and Manage Incoming Data.
- Developed workflows in Oozie for business requirements to extract the data using Sqoop.
- Developed MapReduce (YARN) jobs for cleaning, accessing and validating the data.
- Used Pig scripts and Hive queries to prepare source data for specific use cases and loaded data into specific data marts.
- Optimized existing Hive and Pig scripts as new strategies were learned.
- Automated the work flows to export data from databases into Hadoop.
- Designed workflows by scheduling Hive processes for Log file data, which is streamed into HDFS using Flume.
- Worked on Different File Formats such as XMLs, JSON and CSV.
- Used HDFS as a data staging area before loading data into the enterprise data warehouse.
- Integrated Hadoop into traditional ETL, accelerating the extraction, transformation, and loading of massive structured and unstructured data.
- Involved in migration of some ETL process from Oracle 10g to Hadoop utilizing Hive as a SQL interface for easy data manipulation.
- Developed schemas to handle reporting requirements using Tableau.
- Actively participated in weekly meetings with the technical teams to review the code.
- Performance tuning existing solution and assisting with Production support in identifying bottlenecks and issues in current deployed solutions.
Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, Flume, Oozie, HBase, Oracle 11G and Tableau.
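The MapReduce cleaning and validation jobs described above follow the classic map/shuffle/reduce contract. A conceptual stand-in in plain Python (not the actual YARN jobs; the record format and field names are assumptions for illustration):

```python
from collections import defaultdict

def map_phase(records):
    """Map: parse CSV-like records, drop malformed rows, emit (key, value)."""
    for line in records:
        parts = [p.strip() for p in line.split(",")]
        # Validation: keep only well-formed rows with a numeric count.
        if len(parts) != 3 or not parts[2].isdigit():
            continue
        user, action, count = parts
        yield (user, action), int(count)

def reduce_phase(pairs):
    """Shuffle + reduce: group pairs by key and sum their values."""
    grouped = defaultdict(int)
    for key, value in pairs:
        grouped[key] += value
    return dict(grouped)

raw = [
    "alice, click, 3",
    "bob, view, 2",
    "alice, click, 4",
    "broken record",      # malformed: dropped by validation
    "carol, view, oops",  # non-numeric count: dropped by validation
]
totals = reduce_phase(map_phase(raw))
```

In a real MapReduce (YARN) job, the framework performs the shuffle between mapper and reducer tasks; the `defaultdict` here only simulates that grouping in-process.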
Confidential, Kansas City, MO
Sr.Siebel Developer/OBIEE/ETL Developer
Responsibilities:
- Created new Siebel screens, new business components with visibility modes, and new picklists to help the sales process improve their business.
- Created new joins, links, MVGs, pick applets, and business component user properties.
- Created and Modified Applet Level Server and Browser Scripts.
- Created new business services to create activities based on the condition the user selects and to calculate the next available business day using the holiday calendar in Siebel, so that the activity date is set using this functionality.
- Created new indexes, new predefined queries using Julian functions, and new symbolic URLs to integrate the Siebel Analytics dashboards.
- Created a new applet to display custom buttons throughout the Siebel application, giving users a cleaner screen with larger buttons.
- Created a Siebel Task-Based UI from the My To-Dos screen on Activities so users can complete the tasks assigned to them.
- Developed Siebel workflows for customer-specific requirements; created new user properties and workflow policies to trigger workflow processes.
- Created List of Values, State Model, Business Services, BC, Applets, Views and Screens.
- Developed PL/SQL procedures to load data from different interfaces into Siebel using EIM.
- Responsible for compiling objects in Siebel Tools and replacing the SRF on Siebel servers.
- Upgraded OBIEE 10g to 11g with the help of a third-party vendor.
- Installed the complete OBIEE 11g suite in the development environment.
- Developed many OBIEE reports for users to carry out their day-to-day business.
- Created new iBots and dashboards, and migrated reports across different environments.
- Migrated the repository from one environment to other environments.
- Created new connection pools in the RPD to meet user requirements.
- Scheduled reports to be delivered to users using Delivers and the email delivery option.
- Implemented security at different levels based on the requirements.
- Applied several patches to the OBIEE 11g suite to solve some critical errors.
- Created new ETL processes to load new data into the data warehouse.
- Used DAC Client to Manage, Configure and Monitor ETL Process.
- Involved in troubleshooting report failures, iBot failures, and caching failures in production.
Environment: Siebel 8.1.1 Siebel Finance, Siebel Analytics, Configuration, Scripting, Workflows, EAI, Task UI, EIM, Excel Reports, Oracle 11g, Informatica, WebLogic Server and Testing
Confidential, AKRON, OH
Sr. Siebel Developer/OBIEE/ETL Developer
Responsibilities:
- Involved in all phases of the project understanding business requirements, translating business requirements into technical specifications, development, testing.
- Worked extensively on configuration for heavy customization. Created new Siebel objects under the UI, BO, and DO layers to achieve business functionalities.
- Created Joins, Links, MVGs, Pick lists, Pick Applets, BC User properties, Calculated Fields, Applets, BO’s, Integration Objects, New Business Services and Server and Browser Scripts.
- Developed Workflows for customer specific requirements. Creating new User Properties to invoke Workflow process based on user changes from BC.
- Implemented a new integration methodology, posting messages to queue tables using Java Messaging System middleware.
- Created and modified existing workflows to post messages to the Java Messaging System, replacing synchronous integration.
- Used standard data transformation business services such as EAI XML Read from File, EAI XML Write to File, and EAI JMS Transport to enable transformation of data in Siebel EAI.
- Debugging and Maintaining Siebel Workflows.
- Created EBCs and VBCs to access data from the Oracle Trade Management application database.
- Created new Inbound/Outbound Web service and tested with SOAP UI.
- Worked on Product, Accounts, Promotions, Deductions, Claims, Tactics, Sales Volume Planning, Funds Creation and Administration of Funds Modules.
- Created a new Workflow Process Manager component.
- Created new profile configurations to make the JMS queue connection and EBC connection.
- Developed PL/SQL procedures to load data from different interfaces into Siebel using EIM.
- Designed and developed ETL packages to transform data using stored procedures and functions.
- Created ETL process to fetch data from various sources to target database using Informatica.
- Created IFB files and KSH files for EIM, and database views and queue tables for the JMS queue.
- Responsible for compiling objects in Siebel Tools and replacing the SRF on Siebel servers.
- Clarifying technical issues within team & guiding supporting team in their activities.
- Created OBIEE Reports, BIP Reports and OBIEE Dashboards and integrated in to Siebel.
Environment: Siebel 8.0.10 Consumer Goods, Configuration, Scripting, Workflows, EAI, Java Messaging System, EIM, Excel Reports, Oracle 10g, PL/SQL, Informatica, OBIEE 10g and Testing
Confidential, Minneapolis, MN
Sr. Siebel Developer
Responsibilities:
- Involved in all phases of the project life cycle including understanding business requirements, translating business requirements into technical specifications, development, testing.
- Gathered and reviewed requirements for various business user units.
- Responsible for Functional design and Technical design document creation which includes customizations, extensions and Integrations.
- Actively worked to support, troubleshoot, and resolve technical issues relevant to the prototype to achieve functional requirements.
- Involved in configuring many requirements involving joins (including join constraints), links, server scripts, browser scripts, and business services.
- Created Run-time Events to invoke Custom business services during the assignment of service request for call center agents.
- Developed inbound/outbound workflows and workflow policies.
- Created Job Templates for Repeating Component Request for Integration Needs.
- Tested internal and external integration object as per interface requirement.
- Worked on Integration between Siebel and SAP applications.
- Developed an interface between Siebel and the external systems of FedEx and DHL for tracking package delivery status, using MQ Series as middleware for communication.
- Involved in EAI Integration with SAP for communicating with SAP for Placing Orders.
- Extending error and exception handling framework (Oracle AIA PIP Extension Service).
- Testing and validation of the PIP extension using end-to-end integration testing scenarios.
- Worked in Production Support and was primary go to person for the Business Users.
- Prepared technical design documents and test cases, and performed unit and regression testing for the developed requirements.
- Worked extensively on Excel reports: connected to the Siebel database using Excel macros, retrieved data, and displayed the results in Excel sheets.
- Worked with offshore team to initiate development of application.
- Design and Develop interfaces for EIM based on client requirements.
Environment: Siebel 7.7.2.12 Life Sciences, Siebel Tools, Workflows, Java Scripting, EAI, EIM, Testing, Production Support, Workflow Policies, Excel Reports and Actuate Reports, Oracle 10g.
Confidential, PA
Siebel Developer
Responsibilities:
- Involved in the design and development of user interface layer objects such as applets, views, and screens, and modified certain applets using Siebel Tools.
- Created new joins, links, MVGs, picklists, pick applets, and BC user properties.
- Developed new Siebel workflows for customer-specific requirements; created new RCRs and workflow policies for run-time events and debugged existing workflows.
- Used standard data transformation business services such as EAI XML Read from File and EAI XML Write to File to enable transformation of data in Siebel EAI.
- Extensively involved in Creating Functional and Technical Design Document.
- Worked extensively on Siebel EAI and configuration for heavy customization. Created new Siebel objects under the UI, BO, and DO layers to achieve business functionalities.
- Worked on Workflow, Assignment Manager & Siebel script.
- Developed integration solutions using the EAI Web Methods Transport service and eScript.
- Configured Internal Integration Objects and External Integration Objects.
- Worked on Inbound & Outbound integration using Web services and HTTP.
- Development of Siebel Workflow for customer specific requirements.
- Performed server- and browser-side scripting at the applet, BC, and business service levels.
Environment: Siebel 8.0, Configuration, Java Scripting, Workflows, EAI and EIM
Confidential
Siebel Developer
Responsibilities:
- Involved in all phases of the project life cycle including understanding business requirements, translating business requirements into technical specifications, development and testing.
- Configured BOs, BCs, joins, links, picklists, dynamic drilldown objects, MVGs, applets, screens, pick applets, views, and new responsibilities for new requirements.
- Configured MVGs, Dynamic Pick Lists for addresses on Account & Contacts.
- Extended Siebel Base Tables to meet the Business Requirement.
- Added scripts on Business Components, Applets and Created New Business Services.
- Configured New Integration Objects, Access control mechanism for visibility.
- Designed and Configured Workflow processes for the automation of the business tasks.
- Used Workflow Manager for Integration flow between applications
- Created new RCRs/workflow policies and debugged and maintained Siebel workflows.
- Used standard data transformation business services such as EAI XML Read from File and EAI XML Write to File to enable transformation of data in Siebel EAI.
- Extensively worked on VBC’s, EBC’s, and inbound/outbound web services.
- Development of SQL scripts to extract data from legacy system and update in Siebel.
- Used Siebel EIM and IFB files for the import, update, and delete functionality of Siebel EIM.
- Developed commonly used EIM routines that other developers can call from their programs, to minimize coding and standardize program scripts.
- Created custom interface tables to store Client’s data prior to loading into Siebel tables.
- Unit, Integration and regression testing, and support for production environment.
- Provide business with an extract of data from Siebel using PL/SQL scripts.
- Worked with Tier II Production Support team closely and providing the answers they need in a timely manner.
Environment: Siebel 7.7, Siebel Configuration, Java Scripting, Workflows, EAI, EIM and Oracle.
