Big Data Architect Resume

Malvern, PA

SUMMARY

  • Over 13 years of experience in product development and Big Data application development using Hadoop, AWS components, and Oracle Cloud.
  • Experience designing and developing applications on AWS, Azure, and the Hortonworks platform and their ecosystem components.
  • Good hands-on experience with Big Data technologies: Hadoop MapReduce, Spark, HDFS, Hive, Sqoop, and Kafka.
  • Good hands-on experience with AWS components such as Redshift, EMR, S3, and EC2.
  • Experience in configuration management and performance management of cloud infrastructure.
  • Developed complex ETL transformations using Python and Spark in cloud environments.
  • Extensive experience in data ingestion, big data storage planning, complex transformations, data integration, and analysis for the retail sector (McDonald's Corporation).
  • Thrives on challenge and works well under pressure, with the technical expertise to learn new environments quickly, locate inefficiencies in code, and provide quick solutions.
  • Proven ability to take minimal requirements and turn them into crisp deliverables.
  • Proven ability to work with technologies that have minimal documentation.
  • Good domain knowledge of investment banking, retail, and HCM.

TECHNICAL SKILLS

Operating systems: Linux, UNIX, Windows, OS X.

Programming: Java, Python, Scala

Big Data Technologies: AWS, Spark, Kafka, Snowflake, Hive, Sqoop, Redshift, EMR, MapReduce, and related Hadoop ecosystem components

PROFESSIONAL EXPERIENCE

Confidential - Malvern, PA

Big Data Architect

Responsibilities:

  • Designed and developed dashboards for the Sales Leadership team of the Confidential Institutional Investor Group.
  • Gathered requirements from the client, articulated them, and created requirement documents.
  • Ingested data from Siebel and Dynamics into the Confidential Data Lake on AWS.
  • Worked on configuration management and troubleshot performance issues.
  • Configured CI/CD using Bitbucket for the client's production environments.
  • Used PySpark, Spark SQL, EMR, S3, and related AWS technologies to ingest data into Hive tables (a sketch of this ingestion pattern follows this list).
  • Identified production issues from the current and previous phases and provided fixes.
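
A minimal PySpark sketch of the S3-to-Hive ingestion pattern described above. The bucket, paths, schema, and table names are hypothetical placeholders, not the client's actual objects (the real sources were Siebel and Dynamics extracts):

```python
from pyspark.sql import SparkSession

# Hive support lets saveAsTable persist into the EMR cluster's Hive metastore.
spark = (
    SparkSession.builder
    .appName("s3-to-hive-ingest")
    .enableHiveSupport()
    .getOrCreate()
)

# Hypothetical S3 landing path for a raw CRM extract.
raw = spark.read.parquet("s3://example-data-lake/raw/crm/accounts/")

# Light Spark SQL cleanup before loading the curated table.
raw.createOrReplaceTempView("accounts_raw")
curated = spark.sql("""
    SELECT account_id,
           TRIM(account_name) AS account_name,
           TO_DATE(created_ts) AS created_dt
    FROM accounts_raw
    WHERE account_id IS NOT NULL
""")

# Write a partitioned Hive table (the 'curated' database is assumed to exist).
(curated.write
    .mode("overwrite")
    .partitionBy("created_dt")
    .saveAsTable("curated.crm_accounts"))
```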

Confidential, Seattle WA

Big Data Architect

Responsibilities:

  • Designed and developed POCs for a ticketing platform for Confidential World Resorts on the AWS and Azure cloud platforms using Spark, Python, Scala, Sqoop, Oozie, and MKS.
  • Gathered requirements from the client, articulated them, and created requirement documents.
  • Ingested data from the VGS database on Microsoft Azure into the Disney Data Lake on AWS through Kafka and Spark (a streaming sketch follows this list).
  • Worked on configuration management and troubleshot performance issues.
  • Configured CI/CD using GitLab and GitHub for the client's production environments.
  • Used Spark Streaming, Sqoop, and Oozie to ingest data into Hive tables.
  • Created process flows for data movement from the raw zone to the curation zone and on to the user zone.
  • Identified production issues from the current and previous phases and provided fixes.
  • Ingested data from multiple sources for Disney Confidential Ad Sales data warehousing in Snowflake using Airflow (Python); a DAG sketch also follows this list.
  • Performed data cleansing and processing using PySpark and Python.
  • Followed Agile practices for software development.
  • Monitored and directed the workflow of smaller consulting projects or segments of larger projects.
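
A hedged sketch of the Kafka-to-Spark ingestion described above, using Spark Structured Streaming. The broker address, topic, schema, and S3 paths are placeholders, and the spark-sql-kafka connector package must be on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("kafka-to-lake").getOrCreate()

# Hypothetical schema for ticketing events arriving on the Kafka topic.
schema = StructType([
    StructField("ticket_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

# Read the stream from Kafka; broker and topic names are placeholders.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "vgs.ticket.events")
    .option("startingOffsets", "latest")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Land the parsed events as Parquet in the raw zone of the data lake.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://example-data-lake/raw/ticket_events/")
    .option("checkpointLocation", "s3://example-data-lake/checkpoints/ticket_events/")
    .start()
)
query.awaitTermination()
```

And a skeletal Airflow DAG of the kind used for the Ad Sales warehousing work. The task bodies are stubs, and the DAG id, schedule, and callables are illustrative only:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    # Pull the day's ad-sales extract from the source system (stub).
    pass

def load_to_snowflake(**_):
    # Stage the extract and load it into Snowflake, e.g. via the
    # Snowflake connector or a COPY INTO statement (stub).
    pass

with DAG(
    dag_id="adsales_to_snowflake",
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load_to_snowflake)
    extract_task >> load_task
```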

Confidential - Marlborough MA

Big Data Architect

Responsibilities:

  • Designed and developed various modules in AWS using Spark, Scala, Hive, Sqoop, Oozie, and Kafka.
  • Gathered requirements from the client through email and discussion, articulated them, and created requirement documents.
  • Ingested data from Axeda (Kafka), EBS, and Salesforce into the Confidential Data Lake.
  • Used Spark Streaming, Sqoop, and Oozie to ingest data related to IoT devices (Axeda) into Hive tables.
  • Involved in the design and development of the curation and user zones of the Confidential Data Lake.
  • Owned the requirement-gathering, design, and development modules.
  • Created process flows for data movement from the raw zone to the curation zone and on to the user zone.
  • Created Qlik reports on top of the user-zone data to help the Field Services team understand Axeda alarm use cases.
  • Created user-zone tables that combine summary data from Oracle Install Base, case data from Salesforce, and device data from the Axeda platform (a sketch of such a table build follows this list).
  • Identified production issues from the current and previous phases and provided fixes.
  • Responsible for mentoring peers and leading technical teams.
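
A hedged Spark SQL sketch of the user-zone table build described above. The curated-zone table names, columns, and join key are hypothetical stand-ins for the real Oracle Install Base, Salesforce, and Axeda data:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("user-zone-build")
    .enableHiveSupport()
    .getOrCreate()
)

# Combine Install Base summaries, Salesforce case counts, and Axeda alarms.
summary = spark.sql("""
    SELECT ib.serial_number,
           ib.product_name,
           sf.case_count,
           ax.last_alarm_ts
    FROM curated.oracle_install_base ib
    LEFT JOIN (
        SELECT serial_number, COUNT(*) AS case_count
        FROM curated.salesforce_cases
        GROUP BY serial_number
    ) sf ON ib.serial_number = sf.serial_number
    LEFT JOIN (
        SELECT serial_number, MAX(alarm_ts) AS last_alarm_ts
        FROM curated.axeda_alarms
        GROUP BY serial_number
    ) ax ON ib.serial_number = ax.serial_number
""")

# Persist the combined view as a user-zone Hive table for Qlik reporting.
summary.write.mode("overwrite").saveAsTable("user_zone.device_case_summary")
```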

Confidential - Chicago IL

Big Data Architect & AWS specialist

Responsibilities:

  • Designed and developed various modules on AWS and the Hortonworks Big Data Platform, processing daily restaurant transaction data using MapReduce, Hive, Sqoop, and Oozie.
  • Ingested large volumes of XML files into Hadoop using DOM parsers within MapReduce; extracted daily sales, hourly sales, and the product mix of items sold in McDonald's restaurants and loaded them into the Global Data Warehouse (a mapper sketch follows this list).
  • Wrote and tested MapReduce code to perform aggregations on identified and validated data; processed mobile-offers data for restaurants across various US locations and ingested it into Hadoop Hive tables.
  • Designed and developed an application to ingest data from the Sprinklr social media platform and analyze trends across restaurants' social media channels using AWS RDS and Spark.
  • Participated in client meetings to gather requirements for ingesting data for restaurants in the APMEA, Spain, and Canada markets into the Global Data Warehouse.
  • Involved in the design and development of Amazon Redshift tables for Customer 360 analysis of restaurant customers.
  • Worked on migrating Hadoop components to the AWS environment.
  • Set up and managed training sessions on Amazon Web Services components.
  • Responsible for mentoring peers and leading technical teams.
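
The XML ingestion above used DOM parsers inside MapReduce. As a simplified, hedged illustration of the same idea, here is a Hadoop Streaming-style mapper in Python that DOM-parses one XML sales record per input line and emits keyed sale amounts; the element names and record layout are hypothetical, not the production schema:

```python
#!/usr/bin/env python
# Hadoop Streaming-style mapper: parses one XML sales record per line and
# emits (store_id|business_date, net_amount) pairs for downstream aggregation.
import sys
from xml.dom.minidom import parseString

def text(node, tag):
    """Return the text content of the first <tag> child, or None."""
    elems = node.getElementsByTagName(tag)
    return elems[0].firstChild.nodeValue if elems and elems[0].firstChild else None

for line in sys.stdin:
    line = line.strip()
    if not line:
        continue
    try:
        doc = parseString(line)
    except Exception:
        continue  # skip malformed records rather than failing the task
    for sale in doc.getElementsByTagName("Sale"):
        store = text(sale, "StoreId")
        day = text(sale, "BusinessDate")
        amount = text(sale, "NetAmount")
        if store and day and amount:
            # Key and value separated by a tab, as Hadoop Streaming expects.
            print(f"{store}|{day}\t{amount}")
```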

Confidential, Chicago IL

Big data consultant

Responsibilities:

  • Primary responsibilities included the design and development of various modules on the Hortonworks Big Data Platform, processing daily restaurant transaction data using MapReduce, Hive, Sqoop, and Oozie.
  • Designed, developed, and tested MapReduce programs for mobile-offer redemptions and sent the output to downstream applications.
  • Wrote and tested MapReduce code to perform aggregations on identified and validated data (a reducer sketch follows this list).
  • Scheduled multiple MapReduce jobs in Oozie; extracted promotions data for McDonald's stores within the US by writing MapReduce jobs and automating them with UNIX shell scripts.
  • Designed, developed, and documented the data flow so that clients could handle the data efficiently.
  • Troubleshot environment issues and fixed them within the specified timelines.
  • Developed customer relationships through deep engagement, delivering continuous value by meeting customer expectations and handling day-to-day issues.
  • Documented all projects so that first-time users could follow them, and developed the prototypes into customer projects.
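
A matching Hadoop Streaming-style reducer sketch for the aggregation step described above, summing amounts per key. Like the mapper shown earlier, it is an illustration rather than the production MapReduce code:

```python
#!/usr/bin/env python
# Hadoop Streaming-style reducer: input arrives sorted by key, so we sum
# values per (store|date) key and emit one total per key.
import sys

current_key = None
total = 0.0

for line in sys.stdin:
    key, _, value = line.rstrip("\n").partition("\t")
    if key != current_key:
        if current_key is not None:
            print(f"{current_key}\t{total}")
        current_key, total = key, 0.0
    try:
        total += float(value)
    except ValueError:
        continue  # ignore non-numeric values

if current_key is not None:
    print(f"{current_key}\t{total}")
```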

Confidential

Senior Applications Engineer

Responsibilities:

  • As a member of the Fusion HCM Architecture team, designed and developed next-generation prototype applications such as Ask Fusion and Report Discovery.
  • Designed and developed data flows using big data components to handle data from the cloud environment and derive valuable deliverables.
  • Worked with big data tools to analyze patterns in customer activity in cloud environments.
  • Researched new UI components in the Fusion tech stack for a UI improvement project and presented the functionality to more than five internal teams and over 50 end users.
  • Designed, developed, and automated the flow to handle wiring of customer environments in Python; handled one of the most complex third-party integrations in the HCM domain, with over 50 customers successfully implementing the functionality.
  • Maintained environments for the Fusion-Taleo integration for the Strategy and Oracle OpenWorld teams, which involved setting up the environments, applying middleware patches, securing the applications, and wiring the two environments together.
  • Developed customer relationships through deep engagement, delivering continuous value by meeting customer expectations and handling day-to-day issues.
  • Documented all projects so that first-time users could follow them.

Confidential

Software Engineer

Responsibilities:

  • Developed the Futures and Options, Equities, and SWIFT (Society for Worldwide Interbank Financial Telecommunication) modules for the Finacle Wealth Management Solution.
  • Designed a unique process flow for handling SWIFT codes that simplified future enhancements and support.
  • Supported customers on priority issues and took part in knowledge-transfer sessions with internal teams.
