
Big Data Architect Resume


McLean, VA

SUMMARY:

  • I have led Big Data initiatives for industry leaders, architecting and coding high-performance cloud-based data architectures. My main technology skills include Apache Spark/Hadoop/HDFS/Hive, Kafka, Cassandra, and AWS. Seeking opportunities to design and code big data-intensive efforts.

TECHNICAL SKILLS:

Skills: Scala, Spark, Python, Java, PL/SQL, Bash, Web Services, Apex, Kafka, R, Cassandra, Hadoop, Redis, MemSQL, HDFS, Hive, Oracle, SQL Server, Informatica, AWS (EMR, Redshift, RDS, Beanstalk), Docker, Git, Linux

WORK EXPERIENCE:

Big Data Architect

Confidential - McLean, VA

Responsibilities:

  • For Booz Allen, taught monthly internal training sessions on Hadoop, HDFS, Hive, and Spark for Big Data practice consultants, including Java team engineers (a representative training example is sketched after this list).
  • Supported business development efforts by serving as Big Data SME for meetings with client prospects and existing clients.
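Purely as an illustration of the kind of material such training sessions cover (the resume itself includes no code), here is a minimal Spark-on-Hive example in Scala; the table name sales.orders and the region/amount columns are hypothetical, and a configured Hive metastore is assumed.

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch: read a Hive-managed table with Spark and run a simple aggregation.
// Table and column names (sales.orders, region, amount) are hypothetical.
object HiveIntroExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-intro-example")
      .enableHiveSupport()          // assumes a Hive metastore is available
      .getOrCreate()

    import spark.implicits._
    import org.apache.spark.sql.functions._

    val orders = spark.table("sales.orders")
    val totals = orders
      .groupBy($"region")
      .agg(sum($"amount").as("total_amount"))
      .orderBy(desc("total_amount"))

    totals.show(20, truncate = false)
    spark.stop()
  }
}
```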

Tech Lead, Big Data

Confidential - McLean, VA

Responsibilities:

  • Co-led a team of 4 data engineers used by the bank for the architecture and delivery of high-throughput, petabyte-scale transaction data pipelines across 6 Agile/Scrum internal business units.
  • Tech lead for the design and implementation of real-time data streaming apps used by analysts company-wide, developed with a mix of Apache Apex, Spark, Scala, Java, Oracle, and AWS (a representative streaming sketch follows this list).
  • Architect for ETL integration of legacy data related to past Confidential acquisitions and partnerships.
  • Member of the Big Data “Emerging Tech” advisory committee responsible for performing due diligence on new tech and evaluating vendors before upgrade/implementation.
  • Coded two proof-of-concepts for the CEO, using Apache Flink and Avro, for Confidential's shareholders meeting.
  • Consulted with directors and VPs to analyze warehouse processes and define best practices.
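As a hedged sketch of the general shape of the real-time streaming work described above (the production systems mixed Apache Apex, Spark, Scala, Java, Oracle, and AWS; their internals are not shown in the resume), here is a minimal Spark Structured Streaming job in Scala that reads transaction events from Kafka and maintains windowed aggregates. The broker address, topic name, event schema, checkpoint path, and console sink are assumptions for illustration, and the spark-sql-kafka connector is assumed to be on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

// Sketch of a real-time pipeline: Kafka -> parse JSON -> windowed aggregate -> sink.
// Broker address, topic, schema, and checkpoint path are all hypothetical.
object TransactionStreamSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("transaction-stream-sketch")
      .getOrCreate()
    import spark.implicits._

    val schema = new StructType()
      .add("accountId", StringType)
      .add("amount", DoubleType)
      .add("eventTime", TimestampType)

    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "transactions")
      .load()

    val events = raw
      .select(from_json($"value".cast("string"), schema).as("e"))
      .select("e.*")

    // 5-minute tumbling windows of transaction volume per account,
    // tolerating up to 10 minutes of late-arriving data.
    val volumes = events
      .withWatermark("eventTime", "10 minutes")
      .groupBy(window($"eventTime", "5 minutes"), $"accountId")
      .agg(sum($"amount").as("totalAmount"), count(lit(1)).as("txnCount"))

    val query = volumes.writeStream
      .outputMode("update")
      .format("console")                         // stand-in for a real sink
      .option("checkpointLocation", "/tmp/chk")  // hypothetical path
      .start()

    query.awaitTermination()
  }
}
```

In a production pipeline the console sink would typically be replaced by a durable store such as Cassandra, an Oracle table, or S3.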

Senior Consultant, Big Data/“Smart Analytics”

Confidential - NY, NY

Responsibilities:

  • Member of the Big Data practice focused on clients in financial services and insurance.
  • Big Data developer serving as SME for customer engagement, agent performance, and website quote performance analytics. Developed prototypes using Hadoop/Spark/Hive/HDFS and expanded several into production apps (a prototype-style sketch follows this list).
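As an illustrative sketch of the kind of prototype the last bullet describes (the actual engagements are not documented in code here; all table, column, and database names are hypothetical), a Scala/Spark job that joins website quote events to agents and computes per-agent conversion rates over Hive tables:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Sketch of a quote-performance prototype: join web quote events to agents
// and compute a conversion rate per agent. All names below are hypothetical.
object QuotePerformanceSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("quote-performance-sketch")
      .enableHiveSupport()
      .getOrCreate()
    import spark.implicits._

    val quotes = spark.table("web.quotes")   // quoteId, agentId, converted (0/1)
    val agents = spark.table("crm.agents")   // agentId, region

    val performance = quotes
      .join(agents, Seq("agentId"))
      .groupBy($"region", $"agentId")
      .agg(
        count(lit(1)).as("quotes"),
        sum($"converted").as("conversions")
      )
      .withColumn("conversionRate", $"conversions" / $"quotes")

    performance.write
      .mode("overwrite")
      .saveAsTable("analytics.agent_quote_performance")  // hypothetical output table

    spark.stop()
  }
}
```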

Data Architect

Confidential - NY, NY

Responsibilities:

  • Architected real-time data solutions for the Confidential Markets trading platform that stored metrics for 100 million daily transactions. As of 2016, my code processes 12 billion transactions in the nightly job.
  • Developed a library of reusable PL/SQL components for shared use by apps across different divisions.
  • Performed Oracle 10g database tuning and repartitioning for the mortgage division.

Database Developer

Confidential - Bethesda, MD

Responsibilities:

  • Created ETL mappings and processes to improve performance of the Toxic Releases Inventory (TRI) database used by the EPA to track pollutants after Confidential. Team member on a full database re-architecture that improved report generation from hours to minutes.
  • Developed ETL processes for a data warehouse integration for a federal intelligence client.
  • Implemented Java components to integrate with Informatica at a public health client, along with miscellaneous other projects.
