
Lead/Architect/Administrator Resume


Burlington, NC

SUMMARY

  • 18+ years of IT experience in software analysis, architecture, development, implementation and production support of various client/server and web-based applications.
  • 4+ years of experience in Hadoop (Cloudera Distribution) Administration.
  • Hadoop cluster installation, configuration, maintenance and monitoring.
  • Hands-on experience in installing and configuring Hadoop ecosystem components such as MapReduce, YARN, HDFS, Oozie, Hive, Sentry, Impala, Zookeeper and Cloudera Manager.
  • Assisted in tuning and monitoring the performance of the Hadoop ecosystem.
  • Installed Hadoop in a high-availability environment.
  • Assisted in loading data from various data sources into Hadoop HDFS/Hive tables.
  • Led discussions with client leadership explaining architecture options and recommendations. Hands-on, with the ability to define and explain complex concepts and solutions. As an architect, brings knowledge of data analysis, data integration, data modeling, data warehousing and design.
  • Experience working on High Availability and High Traffic applications.
  • Good understanding of Hadoop architecture and hands-on experience with Hadoop components such as Job Tracker, Task Tracker, Name Node, Secondary Name Node, Data Node, Map Reduce concepts and YARN architecture which includes Node manager, Resource manager and AppMaster.
  • Experience in Installation, configuration, setup and Administration of Infosphere Suite.
  • Has 10+ years of strong hands-on experience using ETL and BI tools such as Information Server/Infosphere DataStage 11.5/11.3/9.1/8.7/8.5/8.1/7.5.2 and Hyperion (Essbase, HFM, MDM and FDM).
  • Extensive experience in technical architecture, data modeling, proof of concept, requirement study, system analysis, testing and development of decision support systems.
  • Experience in planning, installation and administration of IBM MQSeries, WebSphere Application Server and WebLogic on platforms such as UNIX, Linux, HP-UX, Solaris, AIX and Windows NT/2000/2003.
  • Experience working with RDBMS such as DB2, Teradata, Oracle and SQL Server.
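The high-availability Hadoop installation noted above is typically administered with the standard HDFS HA commands; a minimal sketch, where the NameNode service IDs `nn1`/`nn2` are placeholders from a typical hdfs-site.xml rather than names from any actual engagement:

```shell
# Check which NameNode in an HA pair is active and which is standby
# (nn1/nn2 are hypothetical service IDs defined in hdfs-site.xml).
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2

# Manually fail over from nn1 to nn2, e.g. before maintenance on nn1.
hdfs haadmin -failover nn1 nn2
```

These commands assume automatic failover (ZKFC) is configured; without it, `-failover` still performs a coordinated manual transition.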

TECHNICAL SKILLS

Hadoop Framework/BigData: HDFS, YARN, MapReduce, Hive, Oozie, Sentry, Impala, Zookeeper and CDH 5.x.

BI-DW Tools: IBM Infosphere suite 11.5/11.3/9.1/8.7/8.5/8.1, DataStage 7.5.2 and Hyperion System 9.3.1 & 11.x (Essbase, Financial Reporting and Planning).

OS Platforms: Linux/Unix/Solaris/AIX/HP-UX, Windows Server 2012/2008/2003/2000/NT and Windows 10/7/XP.

RDBMS: SQL Server, Oracle, DB2 and MySQL

Web/ Application Servers: IIS, Tomcat, Websphere, Weblogic and Apache.

Web Technologies: PERL, ASP, PHP, HTML, DHTML and JavaScript.

PROFESSIONAL EXPERIENCE

Confidential, Burlington, NC

Lead/Architect/Administrator

Responsibilities:
  • Hadoop cluster installation, configuration, maintenance and monitoring.
  • Assisted in tuning and monitoring the performance of the Hadoop ecosystem.
  • Assisted in loading data from various data sources into Hadoop HDFS/Hive tables.
  • Experienced in job workflow scheduling and monitoring tools like Oozie and Zookeeper.
  • Installation, cluster upgrades, patching and routine maintenance for the Hadoop cluster and ecosystem.
  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, managing and reviewing data backups, and managing and reviewing Hadoop log files.
  • Monitored Hadoop cluster job performance and capacity planning.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Provided day-to-day production support of the Hadoop infrastructure, including:
  • Monitoring the Hadoop cluster using tools like Cloudera Manager.
  • Preparing a multi-cluster test harness to exercise the system for performance and failover.
  • Creating a complete processing engine, based on Cloudera’s distribution, tuned for performance.
  • HDFS support and maintenance.
  • Adding and removing cluster nodes.
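The Sqoop imports into HDFS and Hive described above follow this general shape; the connection string, credentials and table names below are placeholders for illustration, not values from an actual engagement:

```shell
# Import a relational table into a Hive table via Sqoop
# (dbhost, sales, etl_user and orders are hypothetical names;
# -P prompts for the password instead of embedding it).
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --hive-import \
  --hive-table default.orders \
  --num-mappers 4
```

The `--num-mappers` value controls parallelism and is tuned against the source database's capacity.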

Environment: Red Hat 7.X/6.X, Hive, Oozie, Sentry, Impala, Zookeeper, CDH 5.12, YARN, Sitescope, MySQL, Cloudera Manager, JIRA tracking tool, HP Service Manager, PuTTY & WinSCP

Confidential, Marlborough, MA

Lead/Architect/Administrator

Responsibilities:

  • Hadoop cluster installation, configuration, maintenance and monitoring.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS and extracted data from MySQL into HDFS (and vice versa) using Sqoop.
  • Assisted in tuning and monitoring the performance of the Hadoop ecosystem.
  • Experienced in administering the Hadoop cluster and reviewing the log files of all daemons.
  • Assisted in loading data from various data sources into Hadoop HDFS/Hive tables.
  • Experienced in job workflow scheduling and monitoring tools like Oozie and Zookeeper.
  • Installation, cluster upgrades, patching and routine maintenance for the Hadoop cluster and ecosystem.
  • Monitored Hadoop cluster job performance and capacity planning.
  • Provided day-to-day production support of the Hadoop infrastructure, including:
  • Monitoring the Hadoop cluster using tools like Cloudera Manager.
  • Preparing a multi-cluster test harness to exercise the system for performance and failover.
  • Creating a complete processing engine, based on Cloudera’s distribution, tuned for performance.
  • HDFS support and maintenance.
  • Adding and removing cluster nodes.
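Gracefully removing a cluster node, as listed above, typically works through the NameNode's exclude file; a sketch in which the file path and hostname are assumptions (the actual path is whatever `dfs.hosts.exclude` points to in hdfs-site.xml):

```shell
# Add the node to be retired to the exclude list
# (path and hostname are hypothetical).
echo "worker05.example.com" >> /etc/hadoop/conf/dfs.exclude

# Ask the NameNode to re-read its include/exclude lists; the node
# enters the "Decommissioning" state while its blocks re-replicate.
hdfs dfsadmin -refreshNodes

# Watch the report until the node shows as "Decommissioned",
# then it can be shut down and removed from the cluster config.
hdfs dfsadmin -report
```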

Environment: Information Server/ Datastage 11.3/8/7 Information Analyser, IMAM, IGC, AIX, WINDOWS, Citrix, DB2, Netezza, Oracle, SQL-Server, WASA Server, and Control-M.

Confidential

Lead/Architect/Administrator

Responsibilities:

  • Hadoop cluster installation, configuration, maintenance and monitoring.
  • Assisted in tuning and monitoring the performance of the Hadoop ecosystem.
  • Assisted in loading data from various data sources into Hadoop HDFS/Hive tables.
  • Experienced in job workflow scheduling and monitoring tools like Oozie and Zookeeper.
  • Installation, cluster upgrades, patching and routine maintenance for the Hadoop cluster and ecosystem.
  • Monitored Hadoop cluster job performance and capacity planning.
  • Provided day-to-day production support of the Hadoop infrastructure, including:
  • Monitoring the Hadoop cluster using tools like Cloudera Manager.
  • Preparing a multi-cluster test harness to exercise the system for performance and failover.
  • Creating a complete processing engine, based on Cloudera’s distribution, tuned for performance.
  • HDFS support and maintenance.
  • Adding and removing cluster nodes.
  • Wrote configuration files for performance tuning in the production environment.

Environment: Information Server/ Datastage 11.3/8/7/8.1/7.5.1, Information Analyser, AIX, Linux, WINDOWS, Citrix, DB2, Oracle, SQL-Server, WASA Server, and AutoSys

Confidential

Lead/Architect/Administrator

Responsibilities:
  • Hadoop cluster installation, configuration, maintenance and monitoring.
  • Assisted in tuning and monitoring the performance of the Hadoop ecosystem.
  • Assisted in loading data from various data sources into Hadoop HDFS/Hive tables.
  • Experienced in job workflow scheduling and monitoring tools like Oozie and Zookeeper.
  • Installation, cluster upgrades, patching and routine maintenance for the Hadoop cluster and ecosystem.
  • Monitored Hadoop cluster job performance and capacity planning.
  • Provided day-to-day production support of the Hadoop infrastructure, including:
  • Monitoring the Hadoop cluster using tools like Cloudera Manager.
  • Preparing a multi-cluster test harness to exercise the system for performance and failover.
  • Creating a complete processing engine, based on Cloudera’s distribution, tuned for performance.
  • HDFS support and maintenance.
  • Adding and removing cluster nodes.
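The Oozie workflow scheduling mentioned above is driven from the Oozie CLI along these lines; the server URL, properties file and job ID below are placeholders:

```shell
# Submit and start a workflow whose definition lives in HDFS;
# job.properties (hypothetical) names the workflow path and
# parameters such as nameNode and jobTracker/resourceManager.
oozie job -oozie http://oozie-host:11000/oozie \
  -config job.properties -run

# Query the status of a running workflow by its job ID
# (the ID shown is an illustrative placeholder).
oozie job -oozie http://oozie-host:11000/oozie \
  -info 0000001-170101000000000-oozie-oozi-W
```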

Environment: Red Hat Linux 5.X/6.X, Hive, Oozie, Sentry, Impala, Zookeeper, CDH 5.10, YARN, Sitescope, MySQL, Cloudera Manager, JIRA tracking tool, HP Service Manager, PuTTY & WinSCP, DataStage 9.1

Confidential, Peoria, IL

Lead/Architect/Administrator

Responsibilities:

  • Hadoop cluster installation, configuration, maintenance and monitoring.
  • Assisted in tuning and monitoring the performance of the Hadoop ecosystem.
  • Assisted in loading data from various data sources into Hadoop HDFS/Hive tables.
  • Experienced in job workflow scheduling and monitoring tools like Oozie and Zookeeper.
  • Installation, cluster upgrades, patching and routine maintenance for the Hadoop cluster and ecosystem.
  • Monitored Hadoop cluster job performance and capacity planning.
  • Provided day-to-day production support of the Hadoop infrastructure, including:
  • Monitoring the Hadoop cluster using tools like Cloudera Manager.
  • Preparing a multi-cluster test harness to exercise the system for performance and failover.
  • Creating a complete processing engine, based on Cloudera’s distribution, tuned for performance.
  • HDFS support and maintenance.
  • Adding and removing cluster nodes.
  • Extensive data analysis on profiles (machine, dealer and customer) and purchase data.
  • Gathered functional requirements from operational, dealer and business users and translated the functional requirements into technical requirements and specifications.
  • IBM Information Server installation and configuration in a distributed environment.
  • Created data architecture technical specification documents for extraction, transformation, cleansing and load (ETL) programs.
  • Responsibilities included installation and configuration of Information Server on DEV, QA, PPROD and PROD environments.
  • Migrated developed DataStage jobs from DEV to QA and PPROD to PROD.
  • Attended Design Review Board (DRB) meetings to discuss the data model changes in detail, attended Technical Forum meetings to discuss design/implementation details of the project, and coordinated development activities with the offshore development team.
  • Installed Information Server/DataStage clients on end-user desktops/laptops/Citrix and troubleshot daily issues.
  • Worked extensively with SAP stages (IDOC, BW and ABAP) and troubleshot connection issues with SAP.
  • Prepared documentation for Information Server suite installation and configuration.
  • Problem determination and support during testing phases (application support and troubleshooting) and DataStage production support.
  • Troubleshot database connectivity problems and created documentation for various administration activities.
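DataStage job migrations between tiers, as described above, are commonly scripted with IBM's istool utility. This is an illustrative sketch only: hosts, credentials, project and archive names are placeholders, and exact option spellings vary by Information Server release:

```shell
# Export DataStage jobs from the DEV project into an archive
# (domain host, credentials, engine/project names are hypothetical).
istool export -domain dev-services:9080 -username dsadm -password secret \
  -archive /tmp/etl_jobs.isx \
  -datastage '"dev-engine/DEV_PROJECT/Jobs/*/*.*"'

# Import the same archive into the QA project on another domain.
istool import -domain qa-services:9080 -username dsadm -password secret \
  -archive /tmp/etl_jobs.isx \
  -datastage '"qa-engine/QA_PROJECT"'
```

In practice the password would come from a credentials file rather than the command line.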

Environment: DataStage 8.5, Information Analyser, Redhat Linux, Citrix, DB2, Oracle, SQL-server, MySQL, PERL, SAP, WASA Server, Tidal, MQ series and Windows 2008.

Confidential, San Francisco, CA

ETL Lead/Architect/Administrator

Responsibilities:

  • Participated in business requirements design calls with business users and worked with the team to shape the design of the ETL task lists.
  • Involved in the complete design and architecture of tables for ETL logic; wrote high-level and detailed designs for ETL mappings per business requirements. Attended the designers’ call to clarify design issues for the developers.
  • Ensured work progressed through the environments on schedule and with quality by tracking the status of team members’ work, coordinating with the offshore team, consolidating issues and helping the team understand and fix them.
  • IBM Information Server installation and configuration in a distributed environment; responsibilities included installation and configuration of Information Server on DEV, UAT and PROD environments.
  • Made sure developed DataStage jobs migrated from DEV to UAT and UAT to PROD.
  • Involved in DR exercises; as part of DR, backed up the projects, scripts and required software and restored them on the designated server, making the whole environment available for DR testing.
  • Prepared documentation for Information Server installation and configuration.
  • Problem determination and support during testing phases (application support and troubleshooting).
  • Developed shell scripts to reset and unlock DataStage jobs.
  • Involved in unit, performance, regression and integration testing of DataStage jobs and prepared documentation.
  • Extensive use of DataStage Director for monitoring job logs to resolve issues.
  • Developed shell scripts to automate file manipulation and data loading procedures.
  • Resolved issues that occurred during regular testing.
  • Troubleshot database connectivity problems and created documentation for various administration activities.
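The reset scripts mentioned above typically wrap the dsjob command-line client; a minimal sketch, assuming placeholder project and job names:

```shell
# Hypothetical project and job names for illustration.
PROJECT=MY_PROJECT
JOB=LoadOrders

# A job left in an aborted state must be reset before it can rerun;
# -mode RESET clears the stale state, -wait blocks until done.
dsjob -run -mode RESET -wait "$PROJECT" "$JOB"

# Rerun the job normally, then report its finishing status.
dsjob -run -wait "$PROJECT" "$JOB"
dsjob -jobinfo "$PROJECT" "$JOB"
```

A production script would also check exit codes and log the job status rather than run the steps unconditionally.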

Environment: DataStage, AIX, WebSphere Application Server, MQSeries, Oracle, SQL Server, DB2, Tivoli Workload Scheduler and Windows 2003

Confidential, Fort Worth, TX

ETL Lead/ Architect/Administrator

Responsibilities:

  • Project requirements gathering, requirements analysis, design, development and testing for the ETL, data warehousing and reporting modules of the project.
  • Analyzed data coming from the staging area implemented in SQL Server and from flat files “pushed” to us.
  • Designed the ETL jobs using Ascential DataStage Enterprise Edition 7.5.2 to extract, transform and load the data into the data warehouse hosted on Oracle 10g (upgraded from 9i).
  • Fixed existing bugs in the ETL code, SQL code used in extract jobs and other mapping errors. DataStage installation and configuration in a distributed environment.
  • Responsibilities included installation and configuration of DataStage on DEV, UAT and PROD environments.
  • DataStage PROD support and troubleshooting.
  • Involved in planning and implementation of DataStage.
  • Involved in DR exercises; as part of DR, backed up the projects, scripts and required software and restored them on the designated server, making the whole environment available for DR testing.
  • Prepared documentation for DataStage installation and configuration.
  • Problem determination and support during testing phases (application support and troubleshooting).
  • Developed shell scripts to reset and unlock DataStage jobs.
  • Troubleshot database connectivity problems and created documentation for various administration activities.

Environment: DataStage, AIX, Java, WebSphere Application Server, Tivoli Workload Scheduler and Windows 2003

Confidential, Boca Raton, FL

Administrator/Developer

Responsibilities:

  • Interacted with business analysts to gather business user requirements.
  • Involved in designing and developing IBM WebSphere DataStage ETL jobs.
  • Created dimensional models using Erwin.
  • Created physical designs for implementation on an Oracle database.
  • Created generic DataStage sequences integrating Korn shell scripts to schedule ETL jobs.
  • Generated test environment and production environment migration documents.
  • IBM Information Server installation and configuration in a distributed environment.
  • Responsibilities included installation and configuration of Information Server 8.1 on DEV, UAT and PROD environments.
  • Migrated developed DataStage jobs from DEV to UAT and UAT to PROD.
  • DataStage PROD support and troubleshooting.
  • Involved in planning and implementation of Information Server 8.1.
  • Installed Information Server/DataStage clients on end-user desktops/laptops and troubleshot daily issues.
  • Information Server/DataStage DR backup.
  • Prepared documentation for Information Server 8.1 installation and configuration.
  • Problem determination and support during testing phases (application support and troubleshooting).
  • Wrote configuration files for performance tuning in the production environment.
  • Developed shell scripts to reset and unlock DataStage jobs.
  • Involved in unit, performance, regression and integration testing of DataStage jobs and prepared documentation.
  • Extensive use of DataStage Director for monitoring job logs to resolve issues.
  • Developed shell scripts to automate file manipulation and data loading procedures.
  • Resolved issues that occurred during regular testing.
  • Troubleshot database connectivity problems and created documentation for various administration activities.

Environment: BPEL PM, MQSeries, Java (JDK 1.2), XML, WebSphere Application Server and Windows 2003.

Confidential, Concord, CA

MQ/Web Logic Administrator

Responsibilities:

  • Installation of WebSphere MQSeries on PROD, UAT, DEV and BCP systems.
  • Implementation of MQ queue manager clustering across the network.
  • Creation and customization of queue managers on Solaris and HP-UX.
  • Designed, created and configured queue managers, queues, process definitions and channels.
  • Implemented WebLogic clustering between WebLogic managed server instances.
  • Deployed applications (JMS, EJB) on WebLogic servers on different machines.
  • Designed and integrated native MQ Java and JMS applications with the database.
  • Systems integration using WebSphere MQ on distributed and mainframe platforms.
  • Installation of WebSphere MQ 6.0 server and clients on distributed platforms (HP-UX and Solaris).
  • Created documentation for various administration activities specific to the SOA engine.
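Queue manager creation and object definition, as described above, follow the standard MQ control commands; all object and channel names below are hypothetical:

```shell
# Create and start a queue manager (name is a placeholder).
crtmqm QM.APP1
strmqm QM.APP1

# Define a persistent local queue and a TCP receiver channel
# through the runmqsc administration interface.
runmqsc QM.APP1 <<'EOF'
DEFINE QLOCAL('APP1.REQUEST') DEFPSIST(YES)
DEFINE CHANNEL('TO.QM.APP1') CHLTYPE(RCVR) TRPTYPE(TCP)
EOF
```

Clustered queue managers would additionally define cluster-sender/cluster-receiver channels and name the cluster on the queue definitions.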

Environment: MQSeries, Java, Websphere Application Server, Windows NT and UNIX/Solaris.

Confidential

Sr. Software Engineer

Responsibilities:

  • DataStage 7.5.2 installation and configuration in a distributed environment.
  • Responsibilities included installation and configuration of DataStage on DEV, UAT and PROD environments.
  • Migrated developed DataStage jobs from DEV to UAT and UAT to PROD.
  • DataStage PROD support and troubleshooting.
  • Prepared documentation for DataStage installation and configuration.
  • Problem determination and support during testing phases (application support and troubleshooting).
  • Troubleshot database connectivity problems and created documentation for various administration activities.

Environment: DataStage, Sun Solaris Sparc 10 (64-bit), Java (JDK 1.2), WebLogic Server, Tivoli, Windows 2003 and UNIX/Solaris.
