
Bigdata / Hadoop Administrator Resume

Providence, RI


  • More than 10 years of total IT experience with complete knowledge of SDLC.
  • Around 2 years of experience in Hadoop Administration
  • Six years of experience with the Documentum product suite, versions 6.x and 5.x.
  • Good experience working with Cloudera Distribution
  • Experience in installation, configuration, supporting and managing Hadoop clusters
  • Worked on installation and configuration across multiple environments
  • Experience in analyzing existing Hadoop clusters, identifying performance bottlenecks and providing tuning solutions accordingly
  • Good knowledge of Hadoop cluster architecture and cluster monitoring
  • Experience in transferring data from structured data stores to HDFS using Sqoop
  • Experience in setting up the cluster monitoring tool Ganglia
  • Good understanding of distributed systems and parallel processing architectures
  • Extensive experience with Documentum Content Server, Documentum Administrator, Webtop, Desktop Client, Documentum Compliance Manager, Digital Asset Manager, Documentum Composer, Documentum Application Builder, DQL, and API.
  • Expertise in Installing, Configuring and Migrating Documentum Products.
  • Involved in Documentum development and administration that included creating and maintaining methods, object types, attributes, users, roles, ACLs and jobs, and addressing rendition needs.
  • Expertise in handling Documentum Production support.
  • Trained in third party vendor Migration Utility tools and participated in multiple Migration projects.
  • Interacted with users to resolve their queries and help them better understand the applications.
  • Exposure to FDA 21 CFR Part 11 validation of global electronic repositories for pharmaceutical clients.
  • Hands-on experience with the application servers Apache Tomcat, IBM WebSphere and BEA WebLogic.
  • Proficient in Object Oriented Design Methodologies and all phases of the Software Development Life Cycle model.
  • Good knowledge of complete Lifecycle of IT project including Requirements gathering, Design, Development and Implementation.
  • Demonstrated adeptness in interpersonal skills through working within integrated team environments.
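The Sqoop-based data transfer mentioned above typically takes the shape of an import command. A minimal sketch follows; the connection string, table name and target directory are hypothetical placeholders, and the command is echoed rather than executed since it needs a live cluster:

```shell
#!/bin/sh
# Sketch of a Sqoop import from an RDBMS into HDFS.
# All connection details below are hypothetical placeholders.
JDBC_URL="jdbc:mysql://dbhost:3306/sales"
TABLE="orders"
TARGET_DIR="/user/etl/orders"
# Build the import command; on a live cluster you would run it
# directly instead of echoing it.
CMD="sqoop import --connect $JDBC_URL --table $TABLE --target-dir $TARGET_DIR --num-mappers 4"
echo "$CMD"
```

On a real cluster the `--num-mappers` setting controls how many parallel map tasks pull slices of the table.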


Documentum Suite: Documentum Content Server 6.x/5.x, WebTop, Web Publisher, Documentum Administrator, Documentum Compliance Manager, Digital Asset Manager, Desktop Client, Documentum Transformation Services (DTS), Advanced Document Transformation Services (ADTS), xPlore, D2, xCP, BPM, Documentum Composer, Captiva, Kofax, SharePoint, Documentum Application Builder (DAB), Workflow Manager.

Hadoop: Cloudera Hadoop (Big data), HDFS, MapReduce, Hive, Impala, Sqoop, Flume, Oozie, Ganglia

Databases: Oracle, PL/SQL, SQL Server, MYSQL

Programming Languages: Java, XML, DFC, Docbasic and Documentum DQL/API

Platforms: UNIX, Linux, Windows 7 and Windows NT/XP/2000

Web/Application Servers: Apache Tomcat, IBM WebSphere and BEA WebLogic.

Scanning Solution: Captiva Input Accel

Migration Utilities: Trumigrate and Trucompare

Office Software: MS-Word, MS-Excel, MS-PowerPoint, MS-Windows, MS-Project, MS-Visio and Adobe Photoshop.


Confidential, Providence, RI

BigData / Hadoop Administrator


  • Actively involved in planning, installing, configuring, maintaining and monitoring Hadoop clusters in Apache and Cloudera environments
  • Hands-on experience with major components of the Hadoop ecosystem, including HDFS, Hive, Flume, ZooKeeper, Oozie, Sqoop and Ganglia
  • Imported/exported data between RDBMS and HDFS using data ingestion tools such as Sqoop
  • Commissioned and decommissioned nodes in the Hadoop cluster
  • Performed general system monitoring of machines in the Hadoop cluster through Cloudera Manager and Ganglia (for parameters such as disk space and disk partitions)
  • Created Hive internal/external tables, loaded data and troubleshot Hive jobs
  • Worked on configuring security for Hadoop Cluster
  • Managing and scheduling jobs on a Hadoop Cluster
  • Managing user permissions and creating new users
  • Created shell scripts to clean log files and check disk space after cleaning, to restore DataNodes, and to copy data across clusters
  • Tuned MapReduce configurations to optimize job run times
  • Familiarity with Hive and HBase
  • Experienced in managing and reviewing Hadoop log files
  • Understanding of FIFO, Fair Scheduler and Capacity Scheduler and their usage
  • Enabled mounts on the NameNode from the edge node
  • Having good knowledge of SQL and Databases
  • Responsible for the Content Management System of the Documentum docbase and Operational Support for 4 Repositories
  • Daily Monitoring and Checks of All Logs and Job Output in the servers
  • Daily monitoring of backups and tape status through HP Data Protector, sending the information to the Software Operations Team so that the tapes are swapped in the HP jukebox
  • Monitor all the Documentum servers and the disk space allocated to them
  • Resolve all the tickets raised by users through Easy Vista - internally in Confidential Corporation
  • Operational Support for WebTop, Web Publisher, DA, DTS and Input Accel
  • Key role in the migration to SharePoint from Documentum by using Open Migrate and Share Gate tools
  • Security (ACL) creation and management through DA; applied read-only ACLs to the documents that were migrated to SharePoint
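A log-cleanup script of the kind described in the bullets above might look like this minimal sketch; the log directory and retention window are assumptions, not values from the original environment:

```shell
#!/bin/sh
# Minimal sketch of a Hadoop log-cleanup script.
# LOG_DIR and RETENTION_DAYS are assumptions; set them per cluster.
LOG_DIR="${LOG_DIR:-/tmp/hadoop-logs}"
RETENTION_DAYS=7
mkdir -p "$LOG_DIR"   # ensure the directory exists for a dry run
# remove rotated logs older than the retention window
find "$LOG_DIR" -name '*.log.*' -type f -mtime +"$RETENTION_DAYS" -exec rm -f {} +
# report disk usage after cleaning so the run can be verified
df -h "$LOG_DIR"
```

In practice a script like this would be scheduled via cron on each node, with the `df` output mailed or logged for the daily checks mentioned above.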

Environment: Documentum Content Server 6.5 SP3, DA 6.5 SP3, DTS 6.5 SP3, Webtop 6.5 SP3, Input Accel, SharePoint 2013, DFC, DQL, API, IAPI, IDQL, XML, Tomcat, Remote Servers, HP Linux, WinSCP, HP Data Protector, HP Exceed, SAM, Cloudera Hadoop (Big Data), HDFS, MapReduce, Hive, Impala, Sqoop, Flume, Oozie, Ganglia
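MapReduce tuning of the sort performed in this role is typically applied through mapred-site.xml. The fragment below is an illustrative sketch only; the values are assumptions, not recommendations from the original work:

```xml
<!-- mapred-site.xml fragment: values are illustrative assumptions -->
<configuration>
  <property>
    <name>mapreduce.map.memory.mb</name>
    <value>2048</value>
  </property>
  <property>
    <name>mapreduce.reduce.memory.mb</name>
    <value>4096</value>
  </property>
  <property>
    <name>mapreduce.task.io.sort.mb</name>
    <value>256</value>
  </property>
</configuration>
```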

Confidential, Orlando, FL

Sr. Documentum Consultant


  • Team member of the upgrade project of Documentum 6.5 SP3 to Documentum 6.7 SP1
  • Team member of the EDC Rehost project of the Content Servers from WTC datacenter to EDCw datacenter
  • Responsible for the Content Management System of the Documentum docbases
  • Performed dump and load of CCMR repositories from one test level to another
  • Performed Content Server installation and configuration, and applied patch 14 and a hotfix for the 6.7 SP1 Content Server
  • Performed Content Server installation and applied patch 19 for the 6.5 SP3 Content Server
  • Installed xPlore 1.2 on all the test level and Production servers
  • Installed ADTS 6.7 SP1 on all the test level and Production servers
  • Installed and configured scanning solution Kofax
  • Installed DFC 6.7 on all test-level and production web servers and troubleshot DFC-related issues on those servers
  • Installed BPM for one of the clients for the test level and Production servers
  • Configured LDAP servers and ran the dm_LDAPSynchronization job through DA and shell scripts
  • Supported, administered and provided end-user support for the SharePoint server
  • Created Register Tables in all the test levels and production as per the request of the team members
  • Created Value Assistance tables, doc types as per the requirement
  • Enabled audit trails and provided daily reports for the requested business groups
  • Created Centera Storage object through DA and validated the filestore by creating and migrating documents to Centera filestore
  • Created, configured and used a case based application with xCP
  • Fair knowledge of D2 structure and its working procedure
  • Resolved all tickets raised through RTs internally in Confidential
  • Deployed most of the applications for the Developers in all the test levels and Production Webservers
  • Worked in Agile methodology with 2 weeks Sprint session and daily standup meeting
  • Participated in daily morning production support calls and issue resolution

Environment: Documentum Content Server 6.5 SP3/6.7 SP1, DA 6.7 SP1, ADTS 6.7 SP1, Webtop 6.7 SP1, xPlore 1.2, Kofax, SharePoint 2013, BPM, D2, xCP, DFC, DQL, API, IAPI, IDQL, XML, WebLogic, JBoss Servers, Tomcat, Remote Servers, Unix, WinSCP

Confidential, Bernardsville, NJ

Sr. Documentum Consultant


  • Installed Content Server 6.7 SP1, DA, Webtop and DTS 6.7 SP1 on Windows 2008 servers
  • Customized user interface for Checkin/Checkout, Import, New Document and Advanced Search components
  • Configured LDAP servers and run the dm LDAPSynchronization job through DA
  • Enabled Audit Trail for the specific project types
  • Installed DAR files using Composer and Headless Composer in the repository
  • Performed data migration using the migration utility tools TRUMigrate and TRUCompare, which involved querying and analyzing the data in the system
  • Executed queries through DQL and API for any updates or changes required after the migration
  • After exporting the metadata to an Access database, designed queries and macros in the MDB to manipulate the data per the business client's requirements
  • Created and executed IQ, OQ and PQ scripts
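Post-migration DQL updates of the kind described above are often driven through the idql command-line tool. The sketch below only echoes the statement so it can run anywhere; the repository name, credentials and query are hypothetical examples:

```shell
#!/bin/sh
# Sketch of driving a post-migration DQL update through idql.
# Repository, credentials and the query itself are hypothetical.
REPO="edms_dev"
DM_USER="dmadmin"
DQL="UPDATE dm_document OBJECTS SET subject = 'migrated' WHERE FOLDER('/Migrated', DESCEND)"
# Against a live docbase this would be piped into idql, e.g.:
#   echo "$DQL; go" | idql $REPO -U$DM_USER -P<password>
# Here the statement is just echoed so the sketch is runnable anywhere.
echo "$DQL"
```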

Environment: Documentum Content Server 6.7 SP1, DA 6.7 SP1, DTS 6.7 SP1, Webtop 6.7 SP1, DQL, API, IAPI, IDQL, XML, Migration Utilities - TRUMigrate and TRUCompare, Tomcat, Remote Servers

Confidential, Collegeville, PA

Sr. Documentum Consultant


  • Involved in all stages of the SDLC right from the requirement gathering to deployment and production support
  • Worked with the business SMEs to analyze the current process and the go-forward process
  • Conduct Working Sessions with the business to follow the Content Analysis process and the attribute and property mapping of the data from source to target systems
  • Create mapping specifications as per the target system GDMS, Gnosis or DLTS
  • Develop mapping specification as per the business requirement and make sure the project documentation completed in timely manner
  • Execute in-house tools Content Extraction Tool and Product Search Tool and post the deliverables in the SharePoint VIPER Team Site
  • Supported the administration of the SharePoint website and handled team members' requests for the site
  • Develop DQL/API queries using the Query Tools such as DQMan and Samson
  • Configure the Mapit Tool with the source extraction files provided by the Migration Engineers team
  • Check the results from the Mapit tool to the mapping specification to ensure the results were as expected
  • Support and assist team members in migration development tasks
  • Assisted the Validation Team with the query tools to help in functionality, attribute and mapping testing
  • Work with testing and support team to meet migration deliverables
  • Led the creation of the Mapping Specification Checklist as guidance for other Migration Analysts

Environment: Documentum Content Server 5.3 SP2/6.5 SP3, DA, DCM, Webtop, XmLabeling, Desktop Client, DQL, API, IAPI, IDQL, SharePoint 2010, Extraction and Product Search tools, DQMan, Lifecycle

Confidential, Collegeville, PA

Documentum Consultant


  • Involved in 5 migration projects to date: ELIS, JEDMS, SMwebdoc, Fusion and Itapevi
  • Executed the migration utility tools TRUMigrate and TRUCompare for data migration, which involved querying and analyzing the data in the system
  • Migrated Controlled and Non-Controlled documents into EDMS docbase from other docbases
  • For ELIS Project migrated different types of documents like Artworks, Drawings, Label Text and Core Data Sheets. Executed the SAP DIR creation tool to create linkages between the documents
  • For SMwebdoc project created folders with the Migration Utility tool and updated the content with DQL queries
  • Executed the DO_METHOD for the migration, which assigned ACLs to folders and documents and migrated them to the correct folders per the subclasses configured in EDMS
  • Assign ACL, Lifecycle and mandatory attributes for a Controlled document through the XML configuration used by the tool
  • Executed queries through DQL and API for any updates or changes required after the migration
  • Provided on-going support with any issues involved with post migration
  • Performed quality check queries for the controlled and validated data migrated into EDMS
  • After exporting the metadata to an Access database, designed queries and macros in the MDB to manipulate the data per the business client's requirements
  • Assisted the Business Analyst and the client with requirements gathering and attribute mapping, which is key in the migration process
  • Involved with the IT Department and Technical teams for the process analysis, design, watermark and overlays for the PDF rendition of the projects
  • Transferred content and files between remote servers via FTP using FileZilla
  • Used Collaborative Services for the enhancement of teamwork
  • Created and reviewed Work Instructions for few of the Projects involved
  • Conducted and provided knowledge transfer sessions to the IDC team in India

Environment: Documentum Content Server, DA, DCM, Webtop, Desktop Client, DQL, API, IAPI, IDQL, DFC, Oracle, XML, Migration Utilities- TRUMigrate and TRUCompare, Lifecycle, Java, Javascript, HTML5, Win NT/2000 server, Remote Servers

Confidential, Houston, TX

Documentum Administrator


  • Installed and configured Documentum Desktop Client and the scanning solution Captiva Input Accel
  • Created users, groups, ACL’s, Security Model and custom objects and object types using DA and DAB
  • Authenticated users through LDAP Configuration
  • Created and archived DocApps with custom object types, Attributes using Application Builder and Application Installer depending on the business needs
  • Performed Records Migration Jobs to migrate documents of custom object types in different Docbases from Filestore storage area to Content Addressed Storage Centera
  • Created a custom job and method to track the weekly growth of custom object types in Centera storage and calculated the percentage growth of these object types
  • Provided ongoing support and maintenance and created reports for the jobs run in the different docbases; also tracked the timing and frequency of running jobs
  • Designed test cases for different applications as per the requirement of the client
  • Assisted in the Remittance migration project, in which images were split and the documents then migrated to the repository chronologically
  • Involved in requirements analysis, process analysis, design and complete development of the IT Process Project for the business client
  • Used Web Publisher to automate publishing of content, reducing the time users normally spend publishing content to the web
  • Performed Queue Management Job to clean the inbox of the users with the workflow tasks
  • Trained IT Managers in Webtop for the IT Process Project and designed handouts for the sessions
  • Documentation for the different projects which included the project description and scope, current process, object model, security model, Documentum configuration and additional database components required

Environment: Documentum Content Server, DA, DAB, Webtop, Desktop Client, Web Publisher, DQL, API, IAPI, IDQL, DFC, Workflow, Oracle, XML, HTML, Win NT/2000 server, Apache Tomcat, Unix, Captiva Input Accel



  • Developed web pages using Servlets and JSP
  • Analyzed and understood the design document
  • Developed user-friendly screens using Applets and HTML
  • Performed client-side validation using JavaScript
  • Performed server-side programming using Servlets
  • Involved in testing of the application
  • Generated daily and monthly reports
  • Deployed applications to the Tomcat server

Environment: Editplus, Java, JSP, Servlets, JDBC, Java Beans, HTML, Tomcat, SQL Server
