
Big Data Deployment Engineer Resume


SUMMARY

  • 20+ years of IT experience, including 3+ years as Hadoop/Big Data Administrator, Developer Lead, and Big Data architect, using products such as Cloudera Hadoop (HDFS, Spark, Solr, YARN, Oozie, Hive, Impala, HBase, Hue), Teradata, MS SQL Server, Oracle, PostgreSQL, DB2, and Git
  • Manage 50-node production and 15-node development Cloudera Hadoop clusters
  • Knowledge of Troubleshooting Core Java Applications
  • Deploy Hadoop cluster.
  • Add and remove nodes.
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
  • Screen Hadoop cluster job performances and capacity planning.
  • Monitor Hadoop cluster connectivity and security.
  • Manage and review Hadoop log files.
  • File system management and monitoring.
  • HDFS support and maintenance.
  • Keep track of jobs.
  • Monitor critical parts of the cluster; configure NameNode high availability.
  • JVM heap memory allocation
  • Automate manual tasks
  • Software patches and upgrades
  • Disk space management
  • Performance monitoring and tuning
  • Database connectivity and security
  • Database backup and recovery
  • Point of Contact for Vendor escalation
  • Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
  • CHS Big Data Deployment Engineer code review and code deployment
  • Monitoring and troubleshooting of hardware and software Cloudera services
  • Install packages and parcels for Cloudera cluster releases, system upgrades, patches, and stability evaluation
  • Tuning of servers, applications and Cloudera Hadoop infrastructure
  • Work with users, engineering groups, and vendor support for troubleshooting
  • Keep up on emerging technologies relevant to our needs
  • Linux RedHat, CentOS administration and troubleshooting skills
  • Strong Hive, Impala, PostgreSQL, Oracle, and SQL programming and performance-tuning skills.
  • Expert in writing T-SQL and PL/SQL queries; skilled in tuning T-SQL (DDL and DML) queries to improve database performance and availability.
  • Experience in RDBMS Concepts, Database Management Systems, and Database Physical and Logical design, Data Modeling, Data Warehouses.
  • Familiar with data warehouse designs using Star and Snowflake schemas.
  • Expert in building DTS packages with Script Tasks for file path and file validation.
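The capacity-planning and disk-space bullets above can be sketched as a small shell filter over `hdfs dfsadmin -report` output. The report text here is an inlined stand-in with made-up numbers; on a live cluster you would pipe the real command's output into the same awk program.

```shell
# Stand-in for `hdfs dfsadmin -report` output (byte counts are invented).
report='Configured Capacity: 1000000000 (1 GB)
DFS Used: 250000000 (250 MB)
DFS Remaining: 750000000 (750 MB)'

# Pull the byte counts out of the report and compute the used percentage,
# rounded down to a whole number.
used_pct=$(printf '%s\n' "$report" | awk '
  /^Configured Capacity:/ { cap = $3 }
  /^DFS Used:/            { used = $3 }
  END                     { printf "%d", used * 100 / cap }')

echo "DFS used: ${used_pct}%"
```

On a live cluster the first assignment would simply be `report=$(hdfs dfsadmin -report)`; the awk filter is unchanged.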

TECHNICAL SKILLS

Languages: Java, Python, Go, Pig Latin, HiveQL, T-SQL, PL/SQL, .NET (Visual Studio 2003), Visual Basic 6.0, C#, and C++

Operating Systems: Red Hat Enterprise Linux Server release 6.7, Windows 2012/7/2003/2000/XP/NT, UNIX, Sun Solaris

Networking: DNS, DHCP, TCP/IP, HTTP, web security mechanisms, proxies, firewalls, load balancers

Open Source: Apache, NGINX, RabbitMQ, Redis, Elasticsearch, Jetty

Databases: Cloudera Hadoop Big Data, Teradata, MS SQL Server, Oracle, MS Access

Database Tools: Cloudera Hadoop Big Data, Teradata, DB2, SQL Profiler, MongoDB, Query Analyzer, ODI, DTS, SSIS, SSRS, SSAS, Erwin, Visio, Toad, Embarcadero ER/Studio Data Architect 9.0

Reporting Services: IBM Cognos 10.2, Crystal Reports, MS Reporting Services.

Web Technologies: HTML, XML, VB Script, ASP.net

UI: Oracle CRM On Demand, Oracle JD Edwards EnterpriseOne

Web Servers: IIS, Apache Web Server, Embarcadero Team Server.

PROFESSIONAL EXPERIENCE

Big Data Deployment Engineer

Confidential

Responsibilities:

  • Manage 50-node production and 15-node development Cloudera Hadoop clusters
  • Developed Shell scripts for Deployment into Big Data Clusters
  • Monitoring and troubleshooting of hardware and software Cloudera services
  • Install packages and parcels for Cloudera cluster releases, system upgrades, patches, and stability evaluation
  • Linux/Red Hat administration and troubleshooting skills
  • Strong Shell, Hive, Impala, PostgreSQL, Oracle, and SQL programming and performance-tuning skills.
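One way to sketch the parcel-installation bullet above: Cloudera publishes a `.sha` checksum file beside each parcel, and verifying it before distribution catches corrupt downloads early. The file name and payload below are invented for illustration; only the checksum comparison is the point.

```shell
# Hypothetical parcel pre-deployment check; names are made up for illustration.
parcel=/tmp/DEMO-1.0-el7.parcel
printf 'parcel payload\n' > "$parcel"                 # stand-in for a real parcel
sha1sum "$parcel" | awk '{print $1}' > "$parcel.sha"  # stand-in for the published .sha

# Compare the published checksum against the local file before distributing.
verify_parcel() {
  expected=$(cat "$1.sha")
  actual=$(sha1sum "$1" | awk '{print $1}')
  if [ "$expected" = "$actual" ]; then
    echo "OK: $1"
  else
    echo "MISMATCH: $1"
  fi
}

verify_parcel "$parcel"
```

A script like this fits naturally at the front of a deployment pipeline, failing fast before Cloudera Manager ever tries to distribute a bad parcel.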

Confidential, Bellevue, WA

Oracle Developer

Responsibilities:

  • Developed the ABC schema, packages, and procedures for the ABC module to support error tracking across other modules.
  • Developed Oracle Procedures, Packages to load Prepaid Subscriber and Feature data into new Oracle Adapter.
  • Developed Oracle Procedures, Packages to load CM Subscribers and Features data into new Oracle Adapter.
  • Developed Oracle Procedures, Packages to load CP Subscribers and Features data into new Oracle Adapter.

Confidential

SQL Server Developer

Responsibilities:

  • Gathered requirements for the new data warehouse schema.
  • Developed report design specification documents.
  • Wrote MDX queries.
  • Designed cubes.
  • Worked with SSAS, SSRS, and SSIS.
  • Developed complex reports employing expressions, global collections, and conditional formatting using SQL Server Reporting Services
  • Interacted with business users to determine report specifications.
  • Captured changes to map into the new data warehouse.
  • Converted 150+ .rpt files into .rdl files.
  • Developed and formatted Tablix regions with group expressions and generated templates for .rdl files.
  • Developed .rdl files based on conditions and expressions.
  • Developed data source views and cubes for the data mart using SSAS.
  • Developed datasets and data sources to generate reports.
  • Deployed SSRS reports to the client server.
  • Configured SSRS reports on the client server.
  • Uploaded reports and managed users in Report Manager.
  • Developed a strategy to map raw data into the SQL Server 2008 data warehouse.
  • Developed SQL scripts to generate new reports.
  • Modified stored procedures for new SSRS reports.
  • Developed SSIS for new warehouse load.
  • Developed Stored Procedures to map new data warehouse.
  • Deployed rdl files to Report Manager Server.
  • Configured SSRS on the Confidential server.
  • Performed SSRS server configuration.
  • Deployed .rpt files to Microsoft Office SharePoint Server (MOSS) templates.
  • Deployed SSRS reports to the production server.
  • Developed SSIS (ETL) packages to load data from Stage to Production.
  • Developed dynamic configuration and log files to run .rdl and .dtsx files across multiple environments.
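The multi-environment configuration bullet above can be sketched as a per-environment lookup. `dtexec` is the Windows SSIS runner, so this sketch only prints the command it would issue; the package and config file names are invented for illustration.

```shell
# Hypothetical per-environment config selection; package/config names invented.
pick_conf() {
  case "$1" in
    dev|test|prod) echo "LoadWarehouse.$1.dtsConfig" ;;
    *) echo "unknown environment: $1" >&2; return 1 ;;
  esac
}

# Print (rather than run) the dtexec call for a given environment.
echo "dtexec /File LoadWarehouse.dtsx /Conf $(pick_conf prod)"
```

Keeping one `.dtsConfig` per environment and selecting it at invocation time lets the same `.dtsx` package move through dev, test, and prod unchanged.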

Environment: Windows Server 2008 R2, Crystal Reports, MS Excel, MS Visio, MS Visual Studio 2008, C#, SQL Server 2005/2008, BIDS.
