Big Data Solution Designer Resume


SUMMARY:

  • Hands-on experience in setting up and evaluating Big Data technologies
  • Proficient in customer engagement, relationship management, communication, problem resolution and team building
  • Adept at analyzing, testing and implementing physical database designs to support system requirements, including analysis of transactions and data volumes.
  • Provide technical database administration and support, including processing change management requests, monitoring, performance tuning and database cloning.
  • Track record of conceptualizing and executing strategies to achieve key organizational targets
  • High-level leadership and mentoring ability. Excellent interpersonal and communication skills
  • Knowledge of international work cultures by virtue of overseas exposure in the US

TECHNICAL SKILLS:

Big Data Technologies: Hadoop, Pig, Hive, Sqoop, Impala, Phoenix

Cluster Management Tools: pdsh, Puppet

Operating Systems: Red Hat, CentOS, Ubuntu, AIX, SunOS, HP-UX

RDBMS: Sybase, Oracle, SQL Server

Replication: Sybase Replication Server

Mainframe Connectivity: DirectConnect

Monitoring Tools: Nagios, Ganglia, Munin, BMC Patrol, SmartDBA

Back up & Recovery: SQL BackTrack, native tools

Modeling Tools: ERwin

GUI Tools for DBA: DBArtisan

UNIX Tools: Perl, awk, shell scripts

Report Writers: SQR, RPT

CAREER PROFILE:

Confidential

Big Data Solution Designer

Responsibilities:

  • Conduct research and competitive analysis associated with new opportunities
  • Understand client requirements documented in RFPs
  • Develop creative proposals and value propositions customized to meet the business objectives of clients
  • Perform hardware and effort estimation for proposals in the Telecommunications vertical (a sizing sketch follows the technologies line below)
  • Coordinate with the finance teams on commercials

Technologies used: Hive, Spark and UNIX shell scripts on a Hortonworks platform (also evaluated NiFi)
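
As an illustration of the hardware estimation mentioned above, a minimal shell sketch of the back-of-the-envelope cluster sizing such proposals rely on; every figure below is a hypothetical placeholder, not a number from an actual bid:

    #!/bin/sh
    # Hypothetical Hadoop cluster sizing: all inputs are illustrative.
    RAW_TB=100       # raw data volume in TB (assumed)
    REPL=3           # HDFS replication factor (Hadoop default)
    OVERHEAD=30      # % headroom for temp/intermediate data (assumed)
    NODE_TB=24       # usable storage per data node in TB (assumed)

    # total = raw x replication x (1 + overhead), rounded up to whole nodes
    NEED_TB=$(( RAW_TB * REPL * (100 + OVERHEAD) / 100 ))
    NODES=$(( (NEED_TB + NODE_TB - 1) / NODE_TB ))
    echo "~${NEED_TB} TB usable capacity across ${NODES} data nodes"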

Confidential

Big Data Infrastructure Architect

Responsibilities:

  • Define and lead the realization of the data strategy roadmap including data modeling, implementation and data management
  • Research new technologies and approaches for presenting key business insights by analyzing Big Data
  • Mentor and train a team of 10 offshore resources providing 24/7/365 support to a US-based client in the fleet management business
  • Act as the single point of contact for the customer for all day-to-day operations, issue resolution, change management and escalations

Confidential

Architect

Responsibilities:

  • As part of automation initiatives, developed scripts to automate the installation and configuration of Hadoop on virtual machines, enabling business users to focus on business problems rather than the complexities of setting up Hadoop; later extended the same approach to the Amazon EC2 cloud (see the first sketch after this list).
  • Involved in the design & development of a POC for a leading insurance customer. The POC demonstrated actionable insights that can be generated from hitherto unused enterprise data sources (web logs) using Big Data technologies (Hadoop, Hive).
  • Involved in a POC that demonstrated how processing time shrinks when an existing workload is re-implemented on Big Data technologies. A leading insurance customer of Confidential runs a monthly forecasting process on Informatica/Oracle 11g that takes close to 21 days to complete. Identified one module, developed MapReduce scripts implementing the equivalent business logic (aggregation, sorting, lookups etc.), generated data to scale and ran it on a 5-node Hadoop cluster: the module took 13 hours in the existing process and 2 hours on the cluster (see the streaming sketch after this list). Through this proof of concept, successfully demonstrated that periodic, long-running tasks can gain significant efficiency from alternative processing mechanisms and data stores such as Hadoop.
  • Designed & deployed a solution to launch Hadoop clusters on the cloud using Apache Whirr, integrated with monitoring solutions Nagios & Ganglia.
  • Develop & deliver several training programs on Big Data technologies to internal teams

Confidential

Database Architect

Responsibilities:

  • Functioned in a heterogeneous environment spanning seven geographic locations for a major healthcare organization, ranging from mainframes running DB2 and middleware systems running Sybase, Oracle and UDB to NT servers running SQL Server.
  • As a database administrator on the Georgia Database Services team, accountable for keeping the OLTP and DSS production servers online and available in a Sybase environment, and for keeping the source (DB2) and destination (Sybase) systems of the replication environment in sync and resolving any issues (see the health check after this list).
  • Developed scripts and implemented SQL BackTrack for daily and weekly backups (see the dump sketch after this list); performed upgrades, applied emergency bug fixes (EBFs) to production and test servers, and regularly used monitoring tools such as BMC Patrol and SmartDBA to resolve performance issues.
  • Migrated servers from HP to AIX platforms during the switchover and acquired working knowledge of Oracle 8i.
  • Involved in the migration of two projects involving Oracle instances and in a disaster recovery project for all Oracle-based systems in the Georgia region.
  • Focused on testing & evaluating new database backup & recovery and performance monitoring tools, apart from managing day-to-day DBA activities.
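
As an illustration of the replication monitoring mentioned above, a minimal check against the Replication Server; the server name and login are placeholders:

    # Hypothetical replication health check: 'admin who_is_down' lists
    # any Replication Server threads (DSI, RepAgent user, etc.) that are down.
    isql -S REPSRV -U sa -P "$SA_PWD" <<EOF
    admin who_is_down
    go
    EOF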
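
And a minimal sketch of a nightly dump using the native tools; the production jobs used SQL BackTrack, and the server, database and path names here are placeholders:

    #!/bin/sh
    # Hypothetical nightly Sybase backup via native isql.
    SRV=PRODSRV; DB=claims; STAMP=$(date +%Y%m%d)
    isql -S "$SRV" -U sa -P "$SA_PWD" <<EOF
    dump database $DB to "/backups/$DB.$STAMP.dmp"
    go
    dump transaction $DB to "/backups/$DB.$STAMP.trn"
    go
    EOF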
