Big Data Analyst Resume

Alexandria, VA

SUMMARY

  • 10+ years of experience with Oracle PL/SQL, UNIX, Hadoop/Big Data and Spark technologies, including seven years of solid experience in the Banking domain. Highly organized and efficient in fast-paced, multitasking environments, with a strong ability to prioritize effectively and accomplish objectives with commitment and enthusiasm
  • Extensive work experience with RDBMS databases such as Oracle
  • Strong working experience with Big Data technologies such as Hadoop, Hive, MapReduce, YARN, Spark, RDDs, Datasets and Streaming
  • Well trained in niche technologies such as Apache Spark SQL for fast data processing
  • Impressive expertise in writing Hadoop jobs for analyzing data using Hive Query Language
  • Solid experience with Hive performance tuning techniques such as partitioning and bucketing
  • Experience in importing and exporting data between HDFS and relational database management systems using Sqoop
  • Strong experience in development, maintenance and support phases
  • Highly proficient in writing database scripts in SQL and PL/SQL, including stored procedures, functions, triggers, packages and indexes
  • Exceptional skills in automating repetitive tasks using UNIX Bash shell scripting, Python and the AutoSys job scheduler in a Linux environment
  • Solid experience in the Investment Banking, Telecom and Government sector domains
  • Proficient in performance tuning of database SQL scripts
  • Excellent ability to provide innovative solutions to business needs
  • Strong judgment and decision-making abilities

TECHNICAL SKILLS

Big Data: Hadoop, Hive, Sqoop, Apache Spark, Fast Data, MapReduce

Operating Systems: Windows Server, UNIX, LINUX, Solaris, Mainframes

Databases: Oracle 12c/11g, DB2, Sybase

Programming languages: SQL, PL/SQL, UNIX, Python, AutoSys Job Scheduler, XML

IDE Tools: Toad, SQL Developer, SQL*Plus, Sublime Text

Others: ArcGIS, Clarity, Informatica, BusinessObjects, IBM MQ, Splunk

PROFESSIONAL EXPERIENCE

Confidential, Alexandria, VA

Big Data Analyst

Responsibilities:

  • Performed analysis of vast data stores and uncovered insights from the data
  • Loaded data from relational databases into Hive
  • Implemented high-speed querying using Spark SQL, Hive, PL/SQL and SQL
  • Translated complex functional and technical requirements into detailed designs
  • Efficiently utilized PySpark to create datasets for data analysis (see the sketch after this list)
  • Focused on performance tuning of Hive/Spark SQL queries for faster results
  • Led the phase of providing data insights to the business using Hive, Spark, databases and UNIX
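
A minimal PySpark sketch of the kind of dataset creation and Spark SQL querying described above. The database, table and column names (bank_db.transactions, branch_id, amount, status) are hypothetical; the project's actual queries are not reproduced here.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets Spark read the warehouse tables loaded from the RDBMS.
spark = (
    SparkSession.builder
    .appName("transactions-analysis")   # hypothetical application name
    .enableHiveSupport()
    .getOrCreate()
)

# Read a Hive table (hypothetical name) into a DataFrame for analysis.
txns = spark.table("bank_db.transactions")

# Build an analysis dataset: daily totals per branch, cached for repeated use.
daily_totals = (
    txns.filter(F.col("status") == "POSTED")
        .groupBy("branch_id", "txn_date")
        .agg(F.sum("amount").alias("total_amount"),
             F.count("*").alias("txn_count"))
        .cache()
)
daily_totals.show(20)

# The same aggregation expressed in Spark SQL for ad hoc querying.
txns.createOrReplaceTempView("txns")
spark.sql("""
    SELECT branch_id, txn_date,
           SUM(amount) AS total_amount,
           COUNT(*)    AS txn_count
    FROM   txns
    WHERE  status = 'POSTED'
    GROUP  BY branch_id, txn_date
""").show(20)
```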

Technical Environment: Hadoop, Hive, Spark SQL, Sqoop, Oracle 11g, PL/SQL, Python, UNIX

Confidential

Hadoop Developer

Responsibilities:

  • Highly involved in creating Hive tables, loading data, writing Hive queries, and generating partitions and buckets for optimization (illustrated in the sketch after this list)
  • Imported data from Oracle into HDFS using Sqoop; performed full and incremental imports using Sqoop jobs
  • Responsible for managing data coming from various sources and loading it into HDFS
  • Fine-tuned PySpark code for optimized utilization of Hadoop resources in production runs
  • Built a real-time pipeline for streaming data using Spark Streaming
  • Extensive experience in writing and optimizing Oracle PL/SQL and SQL database queries
  • Strong focus on the development of stored procedures, functions and SQL scripts for multiple applications in the project
  • Performed significant performance tuning by applying query optimization techniques
  • Constructively involved in data modeling for several applications
  • Extensively utilized Explain Plan, Oracle hints and the creation of new indexes to improve the performance of SQL statements
  • Flexible in providing workday and after-hours production support for applications
  • Automated repetitive tasks using Python and UNIX Bash scripting
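
A minimal sketch of the partitioning and bucketing pattern referred to above, written in PySpark for consistency with the other sketches. The table name, columns, partition key and bucket count are all hypothetical; the one-time Hive DDL is shown as a reference string rather than executed from Spark.

```python
from pyspark.sql import SparkSession

# One-time Hive DDL for the target table (normally run once in Hive itself,
# e.g. via beeline). Partitioning by business date lets queries prune whole
# partitions; bucketing by account_id groups each account's rows into a
# predictable bucket. Table, columns and bucket count are hypothetical.
HIVE_DDL = """
CREATE TABLE IF NOT EXISTS bank_db.txn_history (
    account_id  BIGINT,
    amount      DECIMAL(18,2),
    status      STRING
)
PARTITIONED BY (txn_date STRING)
CLUSTERED BY (account_id) INTO 32 BUCKETS
STORED AS ORC
"""

spark = (
    SparkSession.builder
    .appName("txn-history-analysis")   # hypothetical application name
    .enableHiveSupport()
    .getOrCreate()
)

# A date-bounded query reads only the matching partition instead of the
# whole table, which is the main payoff of the partitioning scheme above.
spark.sql("""
    SELECT account_id, SUM(amount) AS total_amount
    FROM   bank_db.txn_history
    WHERE  txn_date = '2016-03-31'
    GROUP  BY account_id
""").show(20)
```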

Technical Environment: Hadoop (Hortonworks), Sqoop, Hive, MapReduce, Apache Spark SQL, Oracle 11g, PL/SQL, SQL, Python, UNIX

Confidential

Senior Data Analyst

Responsibilities:

  • Ensured in-depth data analysis of all the business requests raised
  • Automated data retrieval across complex systems such as Mainframes, DB2 and UNIX using Bash shell scripting (a related sketch follows this list)
  • Ensured the highest standards when providing requested data to Bank Operations users
  • Efficiently fine-tuned queries before running jobs on the Mainframes, since each query run incurs a cost
  • Provided analysis of vendor data feeds to the Bank, such as Broadridge and Omgeo, and of their internal data flow within the Bank's production applications
  • Effectively managed risks to ensure the security and resiliency of the application remained in compliance with firm and regulatory requirements
  • Took complete ownership of assigned tasks until their timely completion
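
The automation in this role was done with Bash shell scripting; the sketch below shows the same kind of repetitive retrieval in Python (which appears in the technical environment) so all examples stay in one language. The connection string, schema, columns and output path are hypothetical.

```python
import csv
from datetime import date

import cx_Oracle  # Oracle client driver; Oracle 11g is listed in the environment

# Hypothetical connection details; real credentials would come from a secured
# configuration, not from the script itself.
conn = cx_Oracle.connect("ops_report", "secret", "proddb-host:1521/PRODDB")

# Hypothetical extract query with a bind variable for the business date.
QUERY = """
    SELECT trade_id, account_id, trade_date, amount
    FROM   ops.trade_extract
    WHERE  trade_date = :run_date
"""

def extract_daily_file(run_date, out_path):
    """Pull one day's data and write it to CSV for the operations users."""
    cur = conn.cursor()
    try:
        cur.execute(QUERY, run_date=run_date)
        columns = [col[0] for col in cur.description]
        with open(out_path, "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow(columns)    # header row
            writer.writerows(cur)       # the cursor yields result rows
    finally:
        cur.close()

if __name__ == "__main__":
    extract_daily_file(date(2016, 3, 31), "/data/out/trade_extract_20160331.csv")
```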

Technical Environment: Oracle 11g, PL/SQL, SQL, UNIX, Python, AutoSys, DB2, Mainframes, Toad, ServiceNow, Jenkins, SVN, GitHub, IBM MQ, XML

Confidential

Senior Analyst

Responsibilities:

  • Generated SQL scripts for database objects such as sequences, views and indexes (see the sketch after this list)
  • Worked in the Clarity tool to interact with the database
  • Efficiently worked on telecom data involving UNIX and provided high-level reports
  • Worked in the UNIX environment to interact with the physical telephone switches
  • Worked on the Oracle database to ensure the proper reconnection and disconnection of telephone numbers
  • Participated in business requirements and functional requirements meetings to identify gaps in requirements and drive discussion around appropriate solutions
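
A small, hypothetical illustration of generating SQL scripts for sequences, views and indexes: the object names below are invented, and bundling the DDL into a reviewable script file is one plausible workflow rather than the exact one used on this project.

```python
from pathlib import Path

# Hypothetical object names; the real telecom schema is not reproduced here.
DDL_STATEMENTS = [
    """CREATE SEQUENCE tn_request_seq
       START WITH 1 INCREMENT BY 1 NOCACHE""",

    """CREATE OR REPLACE VIEW v_active_numbers AS
       SELECT telephone_number, switch_id, status
       FROM   telephone_numbers
       WHERE  status = 'CONNECTED'""",

    """CREATE INDEX idx_tn_switch
       ON telephone_numbers (switch_id)""",
]

def write_release_script(out_file="release_ddl.sql"):
    """Bundle the DDL into one script that can be reviewed and run in SQL*Plus."""
    script = ";\n\n".join(stmt.strip() for stmt in DDL_STATEMENTS) + ";\n"
    Path(out_file).write_text(script)

if __name__ == "__main__":
    write_release_script()
```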

Technical Environment: Oracle 10g/9i, PL/SQL, SQL, PL/SQL Developer, Clarity Tool

Client: State Government of Michigan, Lansing, USA (Jun. 2008 - May 2009)

Confidential

Software Analyst

Responsibilities:

  • Extensively developed new features, including analysis, coding and testing, in Oracle PL/SQL
  • Developed PL/SQL functions, procedures, triggers and packages
  • Created database tables, views and sequences in the development and production environments
  • Used exception handling extensively for ease of debugging and for displaying error messages in the application (see the sketch after this list)
  • Imported data across various environments using SQL*Loader
  • Created inbound and outbound flat files such as CSV and TXT and exported them to another database
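
A minimal sketch of the exception-handling style described above: an anonymous PL/SQL block with named handlers and a WHEN OTHERS fallback, driven from Python via cx_Oracle so the example stays in one language. The connection details, table and bind values are hypothetical.

```python
import cx_Oracle

# Hypothetical connection details for the development database.
conn = cx_Oracle.connect("app_user", "secret", "devdb-host:1521/DEVDB")

# Anonymous PL/SQL block: named handlers for expected conditions and a
# WHEN OTHERS fallback that returns a readable message to the caller.
PLSQL_BLOCK = """
DECLARE
    v_customer_name customers.cust_name%TYPE;   -- hypothetical table
BEGIN
    SELECT cust_name
    INTO   v_customer_name
    FROM   customers
    WHERE  customer_id = :cust_id;

    :out_msg := 'Found customer: ' || v_customer_name;
EXCEPTION
    WHEN NO_DATA_FOUND THEN
        :out_msg := 'No customer exists with id ' || :cust_id;
    WHEN TOO_MANY_ROWS THEN
        :out_msg := 'Duplicate customer rows for id ' || :cust_id;
    WHEN OTHERS THEN
        :out_msg := 'Unexpected error: ' || SQLERRM;
END;
"""

cur = conn.cursor()
out_msg = cur.var(cx_Oracle.STRING)          # output bind for the message
cur.execute(PLSQL_BLOCK, cust_id=1001, out_msg=out_msg)
print(out_msg.getvalue())
cur.close()
```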

Technical Environment: Oracle 11g/9i, PL/SQL, SQL, SQL*Loader, SQL Developer, SQL*Plus, UNIX, Tortoise SVN

Confidential

Junior Analyst

Responsibilities:

  • Worked on Garmin GPS base map creation
  • Worked on Corps IENC Geo PDF creation
  • Provided a low-cost hardware solution for USPS in-field error checking
  • Worked on the development of a validation tool
  • Worked with ArcGIS tools
  • Worked on GPS map creation
  • Worked on Python scripts
  • Maintained the office employee database

Technical Environment: ArcInfo, ArcEditor, ArcView, Python and ArcReader
