Hadoop Developer/ Administrator Resume
Dallas, TX
SUMMARY:
- Over 9 years of experience in information technology.
- Hands-on experience productionizing Hadoop applications, including administration, configuration management, monitoring, debugging, and performance tuning.
- Hands-on experience working with ecosystem components such as Hive, Pig, Sqoop, and MapReduce.
- Strong knowledge of Hadoop, Hive, and Hive's analytical functions.
- Efficient in building Hive, Pig, and MapReduce scripts.
- Implemented proofs of concept on the Hadoop stack and various big data analytics tools, including migration from databases (e.g., Teradata, Oracle, MySQL) to Hadoop.
- Successfully loaded files into Hive and HDFS from MongoDB and HBase.
- Loaded datasets into Hive for ETL operations.
- Good knowledge of Hadoop cluster architecture and cluster monitoring.
- Experience in using DbVisualizer, ZooKeeper, and Cloudera Manager.
- An insightful Oracle DBA with hands-on experience with Oracle 9i, 10g, and 11g.
- Expert-level skills in installation, configuration, upgrading, patching, troubleshooting, and performance tuning of Oracle databases on different operating platforms.
- In-depth experience with all facets of Database Design, Backups and Recovery, Security Management, Disk Space and File System Management and Monitoring.
- Expertise in backup and recovery strategies using cold backup, hot backup, and RMAN.
- Knowledge of cloning databases for development/testing environments using traditional backup methods such as the Export/Import/Data Pump utilities, as well as RMAN.
- Experience in upgrading Oracle databases from Oracle 9i/10g to Oracle 11gR1/R2 using DBUA.
- Worked on migration of Oracle databases from Solaris to Red Hat Linux in development, integration, and production environments using Oracle Data Migration Assistant.
- In-depth knowledge of Oracle database tuning (STATSPACK, AWR, ADDM, ASH), database migration, and native UNIX commands.
- Experienced in GoldenGate installation and configuration for database replication.
- Successfully implemented Physical and Logical standby database using Oracle Data Guard feature for High availability configuration.
- Developed RMAN scripts for daily/weekly backup and recovery. Used logical backups of full/partial schemas and databases via exp/imp, SQL*Loader, and Data Pump.
- Worked extensively on OLTP/Data warehouse database systems.
- Worked on Sys admin tasks like creating and managing users and their responsibilities.
- Designed backup policies for Single node and multi-node environments using RMAN.
- Proficient in developing Oracle PL/SQL-based applications using stored procedures, functions, packages, triggers, and materialized views.
- Efficient in SQL query optimization and instance tuning using STATSPACK, TKPROF, SQL Trace, Explain Plan, AWR and ADDM reports, gathering statistics, pinning objects, and running scripts.
- Installed and configured Oracle Enterprise Manager (OEM) Grid Control to maintain and monitor databases.
- Expertise in applying various RDBMS & CPU patches using OPatch utility.
- Experience with FTP, SFTP, and SCP file transfers across platforms.
- Experience in writing UNIX Shell scripts to automate business process and daily backup, maintenance, enhancement, designing of data dictionaries and performance tuning.
- Managed many OLTP/Data warehouse databases.
- Installed and configured 2 nodes server side load balancing RAC (Real Application Clusters) databases on Linux with ASM (Automatic Storage Management) and Data Guard.
- Experience in the full life cycle of database administration. Provide 24x7 production support as an on-call DBA.
- Strong analytical and problem-solving skills; able to work within a team environment and independently when needed.
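The daily/weekly RMAN backup scripting mentioned above can be sketched roughly as follows; this is a minimal illustrative run block, not the actual production script, and the channel allocation and retention handling are assumptions:

```sql
-- Hedged sketch of a daily RMAN full backup; names and options are illustrative.
RUN {
  ALLOCATE CHANNEL c1 DEVICE TYPE DISK;
  BACKUP DATABASE PLUS ARCHIVELOG DELETE INPUT;
  BACKUP CURRENT CONTROLFILE;
  RELEASE CHANNEL c1;
}
-- Purge backups outside the configured retention policy.
DELETE NOPROMPT OBSOLETE;
```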
TECHNICAL SKILLS:
Big Data: Hadoop, MapReduce, HDFS, HBase, Zookeeper, Hive, Pig, Sqoop, Cassandra, Oozie, Flume, Talend, Solr, ElasticSearch, Weka
Java Technologies: Java 5,Java 6, JAXP, AJAX, I18N, JFC Swing, Log4j, Java Help API
J2EE Technologies: JSP 2.1 Servlets 2.3, JDBC 2.0,JNDI, XML
Operating System: AIX, Solaris, OEL, RHEL, Windows 2008, Windows 2005, Windows 2003
Programming Languages: PL/SQL, SQL, Perl
Database: Oracle 9i, Oracle 10g, Oracle 11g, MySQL
Tools: Oracle SQL Developer, PuTTY, WinSCP, Toad, pcAnywhere, VNC, Xmanager, Citrix Server
Application Servers: Oracle R12 eBusiness Suite
PROFESSIONAL EXPERIENCE
Confidential, Dallas, TX
Hadoop Developer/ Administrator
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleansing and preprocessing.
- Evaluated business requirements and prepared detailed specifications following the project guidelines required to develop the programs.
- Responsible for building scalable distributed data solutions using Hadoop.
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
- Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
- Imported and exported data into HDFS using Sqoop.
- Wrote MapReduce code to convert unstructured data into semi-structured data and loaded it into Hive tables.
- Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting.
- Worked extensively in creating MapReduce jobs to power data for search and aggregation
- Worked extensively with Sqoop for importing metadata from Oracle.
- Extensively used Pig for data cleansing.
- Created partitioned tables in Hive.
- Managed and reviewed Hadoop log files.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
- Installed and configured Pig and wrote Pig Latin scripts.
- Developed Pig Latin scripts to extract the data from the web server output files to load into HDFS.
- Created HBase tables to store various formats of data coming from different portfolios.
- Developed MapReduce jobs to automate transfer of data from HBase.
- Used SVN (TortoiseSVN) version control for code management (check-ins, check-outs, and synchronizing code with the repository).
- Worked hands-on with the ETL process.
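The partitioned and bucketed Hive tables used for aggregation and reporting above might look roughly like this HiveQL sketch; the table name, columns, and bucket count are hypothetical:

```sql
-- Illustrative HiveQL: a table partitioned by load date and bucketed by id.
CREATE TABLE portfolio_events (
  event_id   BIGINT,
  event_type STRING,
  payload    STRING
)
PARTITIONED BY (load_date STRING)
CLUSTERED BY (event_id) INTO 32 BUCKETS
STORED AS ORC;

-- Queries restricted to one partition prune the rest of the data.
SELECT event_type, COUNT(*)
FROM portfolio_events
WHERE load_date = '2015-01-01'
GROUP BY event_type;
```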
Environment: Hadoop, MapReduce, Hive, HBase, HDFS, Java (JDK 1.6), Linux, Cloudera, Oracle 10g, PL/SQL, SQL*Plus, Toad 9.6, UNIX Shell Scripting, Eclipse.
Confidential, Horsham, PA
SQL/Java developer
Responsibilities:
- Involved in complete requirement analysis, design, coding and testing phases of the project.
- Implemented the project according to the Software Development Life Cycle (SDLC).
- Developed JavaScript behavior code for user interaction.
- Used HTML, JavaScript, and JSP and developed UI.
- Used JDBC to manage connectivity for inserting and querying data, including stored procedures and triggers.
- Designed the logical and physical data model, generated DDL scripts, and wrote DML scripts for Oracle 9i database.
- Part of a team responsible for metadata maintenance and synchronization of data from the database.
- Involved in the design and coding of the data capture templates, presentation and component templates.
- Developed an API to write XML documents from database.
- Used JavaScript to design the user interface and implement validation checks.
- Developed JUnit test cases and validated user input using regular expressions in JavaScript as well as on the server side.
- Developed complex PL/SQL stored procedures, functions and triggers.
- Mapped business objects to database using Hibernate.
- Created Data sources and Helper classes which will be utilized by all the interfaces to access the data and manipulate the data.
- Wrote PL/SQL queries, stored procedures and database triggers as required on the database objects.
- Developed and maintained Ant Scripts for the build purposes on testing and production environments
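A minimal sketch of a PL/SQL stored procedure of the kind described in the bullets above; the table, columns, and procedure name are hypothetical, not taken from the actual project:

```sql
-- Hypothetical PL/SQL procedure: update an order's status, failing loudly
-- if the order does not exist. Names are illustrative assumptions.
CREATE OR REPLACE PROCEDURE update_order_status (
  p_order_id IN orders.order_id%TYPE,
  p_status   IN orders.status%TYPE
) AS
BEGIN
  UPDATE orders
     SET status = p_status
   WHERE order_id = p_order_id;

  IF SQL%ROWCOUNT = 0 THEN
    RAISE_APPLICATION_ERROR(-20001, 'Order not found: ' || p_order_id);
  END IF;
END update_order_status;
/
```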
Environment: Java, Spring, XML, Hibernate, Oracle, PL/SQL, ANT, Maven2, Continuum, JUnit, SVN.
Confidential, Kansas city, KS
Hadoop Developer
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Imported and exported data into HDFS and Hive using Sqoop.
- Experienced in defining job flows
- Experienced in managing and reviewing Hadoop log files
- Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
- Load and transform large sets of structured, semi structured and unstructured data
- Responsible to manage data coming from different sources
- Supported MapReduce programs running on the cluster.
- Involved in loading data from UNIX file system to HDFS.
- Installed and configured Hive and wrote Hive UDFs.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Gained strong business knowledge of health insurance, claim processing, fraud suspect identification, the appeals process, etc.
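Loading data from the UNIX file system into Hive tables, as described above, can be sketched with an external table over an HDFS path; the path, table, and columns here are assumptions for illustration:

```sql
-- Illustrative HiveQL: expose raw HDFS claim data as an external table.
CREATE EXTERNAL TABLE claims_raw (
  claim_id  STRING,
  member_id STRING,
  amount    DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/claims/raw';

-- An aggregate like this compiles to MapReduce jobs under classic Hive.
SELECT member_id, SUM(amount) AS total_claimed
FROM claims_raw
GROUP BY member_id;
```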
Hardware/Software: Hadoop, MapReduce, HDFS, Hive, Java (JDK 1.6), Hadoop distributions from Hortonworks, Cloudera, MapR, and DataStax, Confidential DataStage 8.1 (Designer, Director, Administrator), flat files, Oracle 11g/10g, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX Shell Scripting, Autosys r11.0.
Confidential, Eden Prairie, MN
Oracle DBA
Responsibilities:
- Maintaining Oracle databases supporting Oracle E-Business Suite, SAP, and PeopleSoft applications
- Supporting 11gR2 RAC with ASM on AIX, Linux
- Working on Oracle database-related tickets (Siebel)
- Refreshing test / dev databases using RMAN backup
- Configuring RMAN backups
- Monitoring databases through OEM grid control
- Daily backups checking
- Installing & Configuring Oracle in new servers
- Working with iLo & Xmanager while installing Oracle on new Linux servers
- Disaster Recovery planning, configuration & maintenance
- Configured Oracle GoldenGate (11.2.1.0.1) for table-level replication
- Cloning of Databases for QA and Development from Production databases
- Identifying poorly performing queries using Explain Plan and TKPROF, and assisting developers in optimizing and tuning SQL and PL/SQL code using SQL Trace.
- Provided Change Management, Incident Management, Problem Management, Service Request Management, Task Management using BMC Remedy tools
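The Explain Plan workflow used to identify poorly performing queries typically follows this pattern; the query being examined is a placeholder, not one from the actual workload:

```sql
-- Generate the optimizer's plan for a suspect statement
-- (the statement itself is a hypothetical placeholder).
EXPLAIN PLAN FOR
SELECT *
FROM orders
WHERE customer_id = :cust;

-- Render the most recent plan from PLAN_TABLE.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```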
Environment: Oracle 10g (10.2.0.4), Oracle 11g, Oracle 10g RAC, Data Guard, AIX, OEL 5.7, Solaris, Windows 2008, Windows 2005, Putty, Xmanager, WinSCP.
Confidential
Oracle DBA (RMAN Backup)
Responsibilities:
- Working on backup-failure SRs and resolving the related issues.
- Installed, configured, created and maintained Oracle databases.
- Performed database backup and recovery using recovery manager.
- Implementing RMAN backups on newly built servers.
- Extensive experience in RMAN recovery testing on RAC and stand-alone databases.
- Cloned databases using scripts as well as RMAN. Installed, set up, and configured Data Guard. Refreshed schemas using Export/Import. Created databases, tablespaces, tables, and indexes; set privileges and user logins.
- Identifying resource-intensive SQL queries and handing them to the development team to tune.
- Cloning/data refreshing of test and development databases based on business users' requirements. Optimizing RMAN backup scripts.
- Providing on-call DBA support.
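Cloning a test or development database with RMAN, as described above, can be sketched with a DUPLICATE command; the target name and options are illustrative assumptions (active duplication requires 11g):

```sql
-- Hedged sketch: clone the connected target database to an auxiliary
-- instance named testdb, reading directly from the live datafiles.
RUN {
  DUPLICATE TARGET DATABASE TO testdb
    FROM ACTIVE DATABASE
    NOFILENAMECHECK;
}
```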
Environment: Confidential AIX 5.4 (64bit edition), Linux & Windows.
Confidential
Database Administrator (DBA)
Responsibilities:
- Maintaining Oracle databases supporting the IFS application
- Implemented and supported 11gR2 RAC with ASM on OEL 5.7
- Working on Oracle database related tickets (iServe)
- Refreshing test / dev databases using RMAN backup
- Configuring RMAN backups
- Monitoring databases through OEM grid control
- Daily backups checking
- Reorganizing databases at regular intervals
- Installing & Configuring Oracle in new servers remotely
- Working with iLo & Xmanager while installing Oracle on new Linux servers
- Disaster Recovery planning, configuration & maintenance
- Configured Oracle GoldenGate (11.2.1.0.1) for table-level replication
Environment: OEL 5.7, Windows 2005, Putty, Xmanager, WinSCP.
Confidential
Database Administrator (DBA)
Responsibilities:
- Maintaining ERP IFS Application.
- Giving project access rights to the application users.
- Administering Oracle 10gR2 RAC databases
- IFS Server & Client application installation.
- Configuring / Installing WEB module.
- Deploying presentation objects in application.
- Applying bug-fix and report patches to the application.
- Cloning the database at regular intervals.
- Granting roles & privileges to the Oracle users.
- Maintaining UAT database.
- Taking RMAN & Export backup every day.
- Periodic trimming of log files.
- Database space management.
- Periodic Re-Organizing of Database.
- Evaluating and enhancing database performance by analyzing AWR and ADDM reports.
- Monitoring database activity through Oracle Enterprise Manager and killing any hanging sessions.
- Reducing wait events by distributing I/O across physical disks (based on continuous OEM monitoring during heavy business hours).
- Identifying resource-intensive SQL queries and handing them to the development team to tune.
- Rebuilding unbalanced indexes.
- Compiling invalid objects after patch deployment.
- Deploying new EXEs on the Citrix Presentation Server.
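The index-rebuild and post-patch recompilation tasks above correspond to statements like the following; the index and schema names are placeholders:

```sql
-- Rebuild a fragmented index without blocking DML (name is illustrative).
ALTER INDEX app.orders_status_ix REBUILD ONLINE;

-- Recompile objects invalidated by a patch, in one call per schema.
EXEC DBMS_UTILITY.compile_schema(schema => 'APP', compile_all => FALSE);
```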
Environment: Windows 2003, Oracle 9i, 10g.
Confidential
Junior DBA
Responsibilities:
- Installation of Oracle 9i on Windows.
- Patch upgrade from 9.2.0.1.0 to 9.2.0.8.0.
- Maintaining an Oracle 10g database on Solaris 5.0.
- Checking the DR database.
- Configured Data Guard using RMAN backup.
- Monitoring Data Guard to keep the standby synchronized with the production database.
- Switching over the physical standby database for role changes in order to do routine maintenance on the primary database.
- Ensuring database health by monitoring the alert log and archive generation.
- Sizing of the database and space management (sizing of tables, indexes, rollback segments, and tablespaces).
- Generating a STATSPACK report once a week.
- Export, Import, and RMAN backups.
- Analyzing database tables on a daily and weekly basis.
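The switchover used for routine primary maintenance, as described above, follows the classic pre-12c SQL*Plus sequence sketched here; this is an outline of the technique, not the site's actual runbook:

```sql
-- On the current primary: convert it to a physical standby,
-- terminating active sessions as part of the switchover.
ALTER DATABASE COMMIT TO SWITCHOVER TO PHYSICAL STANDBY WITH SESSION SHUTDOWN;

-- Then on the old standby, once it has applied all redo:
-- promote it to the primary role and open it for service.
ALTER DATABASE COMMIT TO SWITCHOVER TO PRIMARY;
ALTER DATABASE OPEN;
```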
Environment: Solaris 9, Windows 2003, Putty, VNC, WinSCP