
Architect Vancouver Resume


SUMMARY

  • A collaborative engineering professional with substantial experience designing and executing solutions for complex business problems involving large-scale applications, real-time analytics, and reporting solutions. Incorporated an intuitive architecture that helps organizations effectively analyze and process terabytes of structured and unstructured data.
  • Architected, in a short span of time, a social engagement application to replace the existing commenting application, with NoSQL DynamoDB as the backend
  • Produced a comprehensive architecture strategy for environment mapping in AWS using AWS Identity and Access Management (IAM) roles for community platform systems, and successfully implemented it, setting a precedent for other teams to follow
  • Proven history of building large-scale data processing systems and serving as an expert in big data solutions while working with a variety of database technologies.
  • Experience architecting highly scalable, distributed systems using different open source tools as well as designing and optimizing large, multi-terabyte data warehouses.
  • Able to integrate state-of-the-art Big Data technologies into the overall architecture and lead a team of developers through the construction, testing, and implementation phases

PROFESSIONAL EXPERIENCE

Confidential

Architect Vancouver

Responsibilities:

  • Responsible for all facets of project management relating to the strategic and tactical direction of architecture design and implementation. Followed Agile practices, interfaced with stakeholders, and provided updates to the team.
  • Architect solutions for key business initiatives ensuring alignment with future state analytics architecture vision
  • Work closely with the project teams as outlined in the Agile methodology providing guidance in implementing solutions at various stages of projects
  • Adopt innovative architectural approaches to leverage in-house data integration capabilities consistent with architectural goals of the enterprise
  • Responsible for the end-to-end architecture of a Delivery, including its assembly and integration into the IT architecture principles defined with the client
  • Define the structure of the system, its interfaces, and the principles that guide its organization, software design and implementation
  • Responsible for the management and mitigation of technical risks, ensuring that the Delivery services can be realistically delivered by the underlying technology components
  • Involved in designing the project architecture from scratch and integrating components
  • Provided updates to both product owners and their respective organizations; breadth of implementation included effort estimations, identifying risks/issues, and forecasting potential POC requirements for incompatible components from legacy sources
  • Participated in weekly conference sessions with business analysts
  • Created and configured GitLab repositories, groups, projects, and members; built GitLab CI/CD pipelines for testing and deployment automation
  • Created and secured the Snowflake account and set up security privileges
  • Connected and integrated the Snowflake database with Looker dashboards for visualization and analytics
  • Provisioned a Kubernetes cluster to host Airflow deployments, along with the network infrastructure required to operate it properly inside the VPC
  • Deployed the Astronomer product on GCP, which enables many DevOps functions to centrally support, manage, and maintain Airflow deployments in the cloud; includes additional services such as Grafana operational dashboards tied into Prometheus for monitoring
  • Designed, developed, tested, and maintained functional Tableau reports based on user requirements; converted existing BO reports to Tableau dashboards
  • Created different KPIs using calculated key figures and parameters; utilized Tableau Server to publish and share the reports with business users
  • Involved in administration tasks such as setting permissions, managing ownership, providing access to users, and adding them to the appropriate groups
  • Used Spark Streaming APIs to perform the necessary transformations and actions on the fly to build the common learner data model, which receives data from Kafka in near real time and persists it into Cassandra.
  • Configured, deployed, and maintained multi-node Dev and Test Kafka clusters.
  • Developed Spark scripts using Scala shell commands as per requirements.
  • Used Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
  • Developed Scala scripts and UDFs using both DataFrames/SQL/Datasets and RDD/MapReduce in Spark 1.6 for data aggregation and queries, and wrote data back into the OLTP system through Sqoop.
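The Kafka-to-Cassandra pipeline above hinges on a per-record transformation applied inside each streaming micro-batch. A minimal plain-Python sketch of such a transform (field names are hypothetical; in the actual job this logic would run inside a Spark Streaming map over Kafka message values):

```python
import json

def to_learner_row(kafka_value: bytes) -> dict:
    """Map one Kafka message onto a learner-model row.

    The input is the raw Kafka message value (JSON bytes); the output
    dict mirrors a Cassandra row. Field names are illustrative only.
    """
    event = json.loads(kafka_value)
    return {
        "learner_id": event["userId"],                 # required key
        "course": event.get("course", "unknown"),      # optional field
        "score": float(event.get("score", 0)),         # normalize type
    }
```

In a real Spark Streaming job this function would be applied per record before the resulting rows are persisted to Cassandra.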

Confidential

Senior Consultant

Responsibilities:

  • Experienced in performance tuning of Spark applications: setting the right batch interval, the correct level of parallelism, and memory tuning.
  • Optimized existing algorithms in Hadoop using SparkContext, Spark SQL, DataFrames, and pair RDDs.
  • Implemented the ELK (Elasticsearch, Logstash, Kibana) stack to collect and analyze the logs produced by the Spark cluster.
  • Performed advanced procedures like text analytics and processing, using the in-memory computing capabilities of Spark.
  • Experienced in handling large datasets using partitions, Spark in-memory capabilities, broadcast variables, effective and efficient joins, transformations, and other optimizations during the ingestion process itself.
  • Managed a team of 6+ FTEs. Responsible for all facets of project management relating to the strategic and tactical direction of implementations. Followed Agile practices and provided updates to both product owners and their respective organizations.
  • Breadth of implementation included the following functionality: Risk Mart, Credit Mart, Finance Mart, FC, etc. Performed effort and time estimations, identified risks/issues, and forecast potential POC requirements for incompatible components from legacy sources.
  • Participated in weekly conference sessions with business analysts and high-level architects to report project updates. Used Jira and Confluence for tracking implementation and bugs.
  • Responsible for data ingestion from RDBMS to Hadoop using Sqoop, automating the workflow using Oozie, and performing data cleansing and transformations using Spark and Hive.
  • Resolved performance issues in Spark and Hive with an understanding of Spark physical plan execution, using debugging to run code in an optimized way.
  • Managed overall solution testing and defined a balanced scorecard for tracking testing progress.
  • Designed, developed, tested, and maintained functional Tableau reports based on user requirements; converted existing BO reports to Tableau dashboards
  • Created different KPIs using calculated key figures and parameters; utilized Tableau Server to publish and share the reports with business users
  • Involved in administration tasks such as setting permissions, managing ownership, providing access to users, and adding them to the appropriate groups
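Tuning the level of parallelism, as described above, usually starts from a back-of-the-envelope partition count for the input data. A toy helper under an assumed target of roughly 128 MB per partition (the target size and minimum floor are hypothetical defaults, not values from the original projects):

```python
def suggested_partitions(input_bytes: int,
                         target_partition_bytes: int = 128 * 1024 * 1024,
                         min_partitions: int = 8) -> int:
    """Rough Spark parallelism heuristic: ceil(input / target size),
    with a floor so small inputs still spread across executors."""
    needed = -(-input_bytes // target_partition_bytes)  # ceiling division
    return max(needed, min_partitions)
```

For example, a 10 GiB input divided by a 128 MiB target suggests 80 partitions, which could then feed a `repartition()` call or a `spark.default.parallelism` setting.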

Confidential

Data Quality

Responsibilities:

  • Performed descriptive analysis of data based on labeled features (categorical, numerical, ordinal, continuous, interventions)
  • Used R and Python to implement vectors, lists, data frames, matrices, and arrays; read data from various sources such as CSV, MS Excel, Notepad, XML, and databases; produced box plots and histograms for numeric and categorical variables
  • Performed market basket analysis to see which products go together, so that salespeople can implement a strategy of selling slower-moving products alongside other products.
  • Analyzed feature distributions by drawing histograms and box plots
  • Knowledge of various models for studying data accuracy and relationships
  • Analyzed models such as regression, GLM, random forest, SVM, and CHAID
  • Compared models for accuracy
  • Developed dashboards using Tableau for easy interpretation
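The market basket analysis mentioned above boils down to counting item co-occurrence across transactions and deriving support and confidence for item pairs. A self-contained sketch in plain Python (the support threshold is an illustrative assumption, not a value from the original analysis):

```python
from itertools import combinations
from collections import Counter

def pair_rules(baskets, min_support=0.2):
    """Compute support and directional confidence for item pairs.

    baskets: list of transactions, each a list of item names.
    Returns {(a, b): {"support", "conf_a_to_b", "conf_b_to_a"}} for
    pairs whose support meets the (hypothetical) threshold.
    """
    n = len(baskets)
    item_counts = Counter()
    pair_counts = Counter()
    for basket in baskets:
        items = sorted(set(basket))          # dedupe, canonical order
        item_counts.update(items)
        pair_counts.update(combinations(items, 2))
    rules = {}
    for (a, b), c in pair_counts.items():
        support = c / n
        if support >= min_support:
            rules[(a, b)] = {
                "support": support,
                "conf_a_to_b": c / item_counts[a],  # P(b | a)
                "conf_b_to_a": c / item_counts[b],  # P(a | b)
            }
    return rules
```

High-confidence pairs involving a slow-selling item suggest which faster-moving product to bundle it with.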

Confidential

Senior Consultant

Responsibilities:

  • Coordinated with technical teams for installation of Hadoop and third-party applications
  • Formulated procedures for planning and execution of system upgrades for all existing Hadoop clusters
  • Provided technical assistance for configuration, administration and monitoring of Hadoop clusters
  • Participated in evaluation and selection of new technologies to support efficiency
  • Provided operational support services relating to Hadoop Infrastructure and application installations
  • Responsible for coordinating with business analysts and users to understand business and functional requirements and implement them in ETL design documents
  • Worked on Informatica client tools such as Repository Manager, Designer, Workflow Manager, and Workflow Monitor, and Designer client tools such as Source Analyzer, Target Developer, Transformation Developer, Mapping Designer, and Mapplet Designer
  • Developed various Informatica mappings and mapplets to load data from source to staging and to load dimension keys and fact tables using different transformations
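Loading fact tables as described above typically means swapping each natural key on a staging row for the surrogate key held in the dimension, with an "unknown" member for late-arriving keys. A minimal plain-Python sketch of that lookup step (column names and the unknown-member key are hypothetical; in Informatica this would be a Lookup transformation):

```python
def load_fact_rows(staging_rows, customer_dim, unknown_key=-1):
    """Replace the natural customer key on each staging row with the
    surrogate key from the dimension lookup, defaulting missing keys
    to an 'unknown' dimension member."""
    fact_rows = []
    for row in staging_rows:
        fact_rows.append({
            "customer_sk": customer_dim.get(row["customer_id"], unknown_key),
            "amount": row["amount"],
        })
    return fact_rows
```

Routing unmatched keys to a dedicated unknown member keeps the fact load from dropping rows while flagging them for later reconciliation.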

Confidential

Associate Software Engineer

Responsibilities:

  • Performed manual and automation testing using the Selenium framework
  • Wrote test cases to verify application functionality based on requirements
  • Involved in the preparation of functional test scripts using automation tools
  • Prepared test case execution reports and defect reports using Quality Center
  • Worked extensively with slowly changing dimensions.
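The slowly changing dimension work mentioned above most commonly follows the Type 2 pattern: when a tracked attribute changes, the current dimension row is expired and a new current row is appended, preserving history. A minimal plain-Python sketch of one such upsert (column names and the single tracked attribute are hypothetical simplifications):

```python
from datetime import date

def scd2_apply(dim_rows, incoming, today=None):
    """Apply one incoming record to a Type 2 slowly changing dimension.

    dim_rows: list of dicts with customer_id, city, start_date,
    end_date, is_current. Expires the current row when the tracked
    attribute changed, then appends a new current row.
    """
    today = today or date.today().isoformat()
    current = next((r for r in dim_rows
                    if r["customer_id"] == incoming["customer_id"]
                    and r["is_current"]), None)
    if current and current["city"] == incoming["city"]:
        return dim_rows                      # no change: nothing to do
    if current:
        current["is_current"] = False        # expire the old version
        current["end_date"] = today
    dim_rows.append({"customer_id": incoming["customer_id"],
                     "city": incoming["city"],
                     "start_date": today,
                     "end_date": None,
                     "is_current": True})    # new current version
    return dim_rows
```

Queries against the dimension then filter on `is_current` for the latest view, or on date ranges for point-in-time history.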
