
Big Data Developer (Consultant) Resume


Herndon, VA

SUMMARY:

  • Big Data professional with 12+ years of experience, including 9 years in Data Warehousing and 3 years of Big Data development using Spark, Hive, Python, R programming, and machine learning statistical algorithms for classification and regression on Hortonworks, Cloudera, and Linux VMs.

PROFESSIONAL EXPERIENCE:

Confidential - Herndon, VA

Big Data Developer (Consultant)

Responsibilities:

  • Analyze multiple customer data sources using Python and R. Write Python DataFrame code using Pandas, NumPy, and Matplotlib for Apache Spark, and write Hive scripts in Linux.
  • Use R and Python libraries on top of Spark to build histograms, pie charts, correlation, and linear and multiple regression; work with the Solution Architect to build machine learning algorithms for classification and regression on network security data.
  • Run, test, and deploy code (histograms, pie charts, regression analysis) and Hive and Pig scripts on an IBM server for client feedback.
  • Clean and merge large volumes of data using Unix and Python, and initiate integration processes for data storage and analysis in fully virtualized Linux environments.
  • Build multiple servers for Big Data applications on Ubuntu; provision servers using Docker and Vagrant.
  • Proof of concept: Tableau and R with Spark, Splunk, and HBase integration to build dashboards.
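The kind of exploratory pandas/NumPy analysis described above can be sketched as follows; all column names and data are hypothetical, not drawn from the actual engagement's network-security dataset:

```python
import numpy as np
import pandas as pd

# Hypothetical sample standing in for network-traffic features
rng = np.random.default_rng(0)
df = pd.DataFrame({"bytes_sent": rng.normal(500.0, 50.0, 200)})
df["latency_ms"] = 0.02 * df["bytes_sent"] + rng.normal(0.0, 1.0, 200)

# Histogram bin counts (the Matplotlib step would plot these)
counts, edges = np.histogram(df["bytes_sent"], bins=10)

# Simple linear regression (slope, intercept) and correlation
slope, intercept = np.polyfit(df["bytes_sent"], df["latency_ms"], 1)
corr = df["bytes_sent"].corr(df["latency_ms"])
```

In a Spark setting, the same frame could be handed to the cluster via `spark.createDataFrame(df)` before scaling the analysis out.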

Confidential - Herndon, VA

Lead Data Analyst (Consultant)

Responsibilities:

  • Analyze Mortgage Data using SQL and Unix scripts. Based on the data analysis, discuss the data migration plan with the Business team, Architect, and Developers on the Informatica, Master Data Management, and Reporting teams.
  • Analyze multiple customer data sources using Python and R; write Hive scripts and Spark code.
  • Test-run ETL code from the RDBMS to verify that clean, transformed data loads on the server, and check that the ETL logic meets requirements using SQL, TOAD, and R programming.
  • Run SQL and Unix scripts to validate data across Source, Staging, Data Mart, and Business Objects Reports, verifying that projected vs. actual data from source to target follows the business rules for data mapping and lineage in Informatica and MDM.
  • Work with the Architect on a proof of concept for Big Data migration from RDBMS to HDFS using Hive, Pig, and Flume scripts.
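A minimal sketch of the source-side extract-and-reconcile step, with sqlite3 standing in for the RDBMS (in the real migration, Sqoop and Hive would land the extract in HDFS); the table and column names are invented for illustration:

```python
import sqlite3
import pandas as pd

# sqlite3 stands in for the source RDBMS; names are hypothetical
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (loan_id INTEGER, balance REAL)")
conn.executemany("INSERT INTO loans VALUES (?, ?)",
                 [(1, 1000.0), (2, 2500.0), (3, 175.5)])
conn.commit()

# Source row count, used to reconcile against the migrated target
source_count = conn.execute("SELECT COUNT(*) FROM loans").fetchone()[0]
extract = pd.read_sql_query("SELECT * FROM loans", conn)

# Basic source-vs-target checks before sign-off
row_count_ok = len(extract) == source_count
balance_total = extract["balance"].sum()
```

The same row counts and balance totals would then be compared against the Hive-side target tables as part of the validation.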

Confidential - Pittsburgh, PA & Washington, DC

Responsibilities:

  • Data Architect: Cloudera 5.4 and Hortonworks HDP 2.2 for running SparkR, PySpark, Hive, Pig, and H2O on a Linux cluster. Data cleaning, transformation, ingestion, and analytics.
  • Data analysis using R programming and Python; data migration from relational databases to Big Data using Sqoop, Hive, and Spark.

Confidential - Washington DC

System / Data Analyst

Responsibilities:

  • Preliminary data analysis for data cleaning and transformation using SQL and R programming. Based on the source data and the timing of incoming data from JPMC and others, build the source-to-target mapping document.
  • Proof of concept for Big Data implementation for pipeline re-engineering of Reporting Data. Run Hive scripts to build databases and tables in HDFS, import and export data between the RDBMS data warehouse and HDFS using Sqoop scripts, and load and manipulate data using Python.
  • Analyze and discuss the reporting data document with ETL (Informatica team), Reporting (Cognos team), DBA, and Architect for a production-ready Source-to-Target mapping and Master Data document. Write SQL scripts to check data against business rules.
  • Data analysis using SQL, TOAD, and R programming; preliminary data cleaning and transformation per business rules for migration of data from PL/SQL to a Unix-based Oracle system. Check the root cause of variance across the Cognos reporting system, Unix system, and source database, and document the causes of data variance and errors.
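The source-vs-target variance check described above can be illustrated with a pandas outer merge; the account numbers and amounts here are made up for the sketch:

```python
import pandas as pd

# Hypothetical source and target snapshots of the same report
source = pd.DataFrame({"acct": [101, 102, 103], "amount": [10.0, 20.0, 30.0]})
target = pd.DataFrame({"acct": [101, 102, 104], "amount": [10.0, 25.0, 40.0]})

merged = source.merge(target, on="acct", how="outer",
                      suffixes=("_src", "_tgt"), indicator=True)

# Rows missing on one side, or present on both with differing amounts
variance = merged[(merged["_merge"] != "both") |
                  (merged["amount_src"] != merged["amount_tgt"])]
```

`indicator=True` adds a `_merge` column flagging rows found only on one side, which helps when documenting the root causes of variance between systems.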

Confidential - Pittsburgh, PA

Data Analyst, Enterprise Risk, Mortgage Loan Division (Consultant)

Responsibilities:

  • For mortgage banking applications for commercial and retail banking: procure data from the source database (Mainframe developer), and run Unix and SQL scripts to migrate data from the Mainframe system to a Unix-based Oracle system.
  • Data analysis using R programming and SQL, with preliminary data cleaning and transformation in R. Work with the business to finalize business rules. Check source-to-target data mapping and test data in Staging, working with ETL (Informatica) developers.
  • Test reports in Cognos and Business Objects along with developers: data quality tests, report frequency, and ad hoc vs. standard reports. Work with Cognos and Business Objects developers.
  • Write the proof-of-concept document for Big Data implementation of mortgage data. Develop Hadoop Hive scripts and import/export data between the RDBMS data warehouse and Hadoop HDFS.
  • Conduct walkthroughs, gather feedback, and obtain document sign-off with the consent of all stakeholders.

Confidential - Boston, MA

Data Analyst, Trading Compliance (Consultant)

Responsibilities:

  • Test pre-trade compliance monitoring for credit risk, portfolio risk, and Anti-Money Laundering compliance prior to trade submission for a trading desk; involved in development of watch list filtering and a data warehouse for AML alerts.
  • Run SQL scripts to analyze data at each stage, including trade submission, release, execution, matching, and posting. Use R for basic statistical analysis, data cleaning, and analysis.
  • Run Unix scripts and review Unix logs in a virtual environment to check trade data, reconcile price margin errors between Unix logs and the database, and check XML for compliance.

Confidential - MD

Functional Business / Data Analyst, Financial Compliance (Consultant)

Responsibilities:

  • Discuss and document the blueprint for system integration: starting from fragmented asset data and compliance tools for watch list filtering and fraud case management, plan for trade surveillance, transaction monitoring, positions, and building risk profiles.
  • Procure Anti-Money Laundering data from source databases (Mainframe, Oracle, Access, Sybase), run Unix scripts to migrate data from Unix to an Oracle-based system, and run SQL and TOAD to check source vs. test data.
  • Analyze the current data in multiple systems from Bridger, Charles River, AWD, and Pershing for data migration.

Confidential - Bloomington, IL

Business System / Data Analyst (Consultant)

Responsibilities:

  • Document change-management discussions in SCRUM meetings and weekly SPRINTs for new screens; prepare wireframes.
  • Run SQL scripts to update the Master Data Dictionary; analyze source data for source-to-target mapping.

Confidential - Baltimore, MD

Senior Data System Analyst (Consultant)

Responsibilities:

  • Work with the Data Architect to document database design and develop the data mapping, flow diagrams, and system migration guidelines for Data Reporting. Run Oracle and Unix jobs on a Linux VM to import data from server to client machine.
  • Testing and data analysis using Oracle and TOAD; error checking for reports against the data warehouse built on OBIEE.

Confidential - New York

Business Systems Analyst (Consultant)

Responsibilities:

  • Coordinate project planning based on discussions with the PM and the offshore development team.

Confidential - Weehawken, NJ

Data Analyst / Anti-Money Laundering (AML) Compliance (Consultant)

Responsibilities:

  • Worked on the GEAR system (Global Environment for Accounting and Reporting) for General Ledger and bookkeeping.
  • Requirements gathering with Business, Legal, and Data teams, covering As-Is and To-Be states for the Actimize implementation; incorporate the feedback.
  • Functional requirements for Suspicious Activity Reports, transaction monitoring, and watch list filtering using Actimize.
  • Performed data reporting (from multiple sources to the data warehouse) using WebI (Web Intelligence) and web-based Actimize.
  • System-tested customized Actimize-based reports, analyzed data and errors in reports, and checked data feeds.
  • Worked on web-based applications such as Trade Processing & Settlement, Client Positions & Balances, and BASEL II compliance.
  • Performed data analysis using SQL and TOAD, checked reconciliation of client positions between the centralized data warehouse and the old database, developed Business Objects reports such as WebI (Web Intelligence), and checked for reported errors.
