
Data Architect Resume

San Francisco, CA

SUMMARY:

  • Accomplished and integrity-driven Data Architect, highly regarded for utilizing technical skills to support and grow business objectives for internationally recognized institutions.
  • Respected as an expert data strategist with command of the complete data lifecycle and of CRISP-DM, SDLC, STLC, and SVLC methodologies.
  • Experienced in parametric and non-parametric statistical analysis of business data.
  • Exemplary academic qualifications, including a Bachelor of Science in Computer Engineering and an SAP Certified Application Associate.
  • Track record of success in progressive IT roles in predictive/statistical modeling, big data engineering, business intelligence, and data architecture.
  • An out-of-the-box thinker who utilizes innovative qualitative/quantitative analysis capabilities to achieve growth and exceed all expectations.
  • Developed and implemented the corporate data strategy and its components
  • Earned a reputation as a motivational and influential project manager by coordinating, directing and implementing OLTP, ODS, data warehouse and Business Intelligence projects/applications.
  • Utilized predictive modeling algorithms (SVM, KNN, RF, decision tree, naive Bayes, and logistic regression), and demonstrated skill in architectures (data architecture, MDM, metadata architecture, relational and dimensional modeling).
  • Improved data access and handling abilities in a collaboration project with database engineers and other scientists to implement new, enhanced queries (Microsoft SQL server and MongoDB).
  • Presented data analytics reports in an interactive 3D visualized format for easy understanding and interpretation, using R packages (plotly, ggplot2, R Markdown) and Tableau embedding.
  • Recommended changes in development, maintenance, and system standards to senior management and key stakeholders.

CORE COMPETENCIES:

  • Statistical Modeling
  • Data Modeling
  • Machine Learning
  • Data Analysis
  • Master Data Management
  • Azure/GCP Cloud
  • Big Data Technologies
  • Interpersonal Skills
  • Data Warehouse
  • Data Cleansing
  • Data Harvesting
  • Data Wrangling
  • Reports/Dashboards Creation
  • BI Tools
  • Project Management

TECHNICAL EXPERTISE:

Scripting/programming languages: R, Python, SQL

Machine learning/deep learning: Classification, regression, and clustering analyses using neural nets (MLP), RF, KNN, SVM, GLM, MLR, logistic regression, and k-means algorithms

Database management systems: RDBMS (Microsoft SQL Server, Oracle DB, Teradata, SAP HANA); NoSQL (MongoDB)

Data storage/processing frameworks: Hadoop and Spark

Data visualization/reporting: Tableau and Shiny

CASE tools: Erwin and ER/Studio

PROFESSIONAL EXPERIENCE:

Confidential, San Francisco, CA

Data Architect

Responsibilities:

  • Developed the data strategy framework, policies, and procedures, along with Master Data Management and metadata management
  • Led and implemented data governance and privacy policies and procedures for Confidential, specifically for contract implementation
  • Developed accountability procedures governing data access, processing, storage, retention, reporting, and auditing to measure contract compliance
  • Developed the data stewardship program and established the metadata registry
  • Developed data lake and lambda architectures using Hadoop big data ecosystem tools and utilities.
  • Developed conceptual/logical data models for Customer 360
  • Led a POC for a churn model

Confidential, San Jose, CA

Data Architect

Responsibilities:

  • Developed the ODS data architecture for the Marketing and Sales subject areas
  • Gathered business data requirements by conducting JAD sessions with the business stakeholders.
  • Developed a relational model (conceptual, logical, and physical) sourced from multiple systems
  • Reviewed and confirmed the dimensional modeling standards, naming conventions, and modeling methodologies.
  • Identified conformed dimensions and facts across multiple subject areas.
  • Created the dimensional model.
  • Created the ELDM-to-dimensional mapping document for ETL loading, and led the ETL team

Confidential, Herndon, VA

Data Architect

Responsibilities:

  • Provided the client with data harvesting, mining, querying, wrangling, reporting, and visualization.
  • Used R packages (caret, H2O.ai, Keras, RODBC, ROracle, rvest, plotly, ggplot2, base R, etc.) and Python packages (scikit-learn, Scrapy).
  • Identified and implemented scripts to collect new data and build large, complex (TB-scale) datasets using Python and R libraries and packages such as Scrapy, Beautiful Soup, and rvest.
  • Built recommendation systems and predictive models using scikit-learn (Python) and caret (R) packages on ten years of e-commerce historical data.
  • Sourced data from the Hadoop data lake for modeling and analytics; streamed data through Spark Streaming to apply probability models using SparkR.
  • Mapped and converted data to new formats using the existing framework (CSV, TXT, SAV, JPEG, PNG files).
  • Presented data analytics reports in an interactive 3D visualized format for easy understanding and interpretation, using R packages (plotly, ggplot2, R Markdown) and Tableau embedding.
  • Used R functions to implement principal component analysis (PCA) for dimension reduction
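The PCA-based dimension reduction mentioned above can be sketched as follows. This is a hypothetical illustration with invented sample data (the actual work used R functions; this sketch uses Python/NumPy, which the resume also lists):

```python
import numpy as np

# Toy feature matrix: 6 observations x 4 features, where the last two
# columns are linear combinations of the first two (invented data)
rng = np.random.default_rng(0)
base = rng.normal(size=(6, 2))
X = np.hstack([base, base @ np.array([[1.0, 0.5], [0.5, 1.0]])])

# Center the data, then take the SVD of the centered matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the first two principal components
X_reduced = Xc @ Vt[:2].T
print(X_reduced.shape)   # (6, 2)

# Fraction of variance explained by the kept components
explained = (S[:2] ** 2).sum() / (S ** 2).sum()
```

Because the redundant columns are exact linear combinations, the first two components capture essentially all of the variance here.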

Confidential, Santa Clara, CA

Data Architect

Responsibilities:

  • Provided the client with data harvesting, mining, querying, wrangling, reporting, and visualization.
  • Used R packages (caret, H2O.ai, Keras, RODBC, ROracle, rvest, plotly, ggplot2, base R, etc.) and Python packages (scikit-learn, Scrapy).
  • Identified and implemented scripts to collect new data and build large, complex (TB-scale) datasets using Python and R libraries and packages such as Scrapy, Beautiful Soup, and rvest.
  • Built recommendation systems and predictive models using scikit-learn (Python) and caret (R) packages on ten years of e-commerce historical data.
  • Sourced data from the Hadoop data lake for modeling and analytics; streamed data through Spark Streaming to apply probability models using SparkR.
  • Mapped and converted data to new formats using the existing framework (CSV, TXT, SAV, JPEG, PNG files).
  • Presented data analytics reports in an interactive 3D visualized format for easy understanding and interpretation, using R packages (plotly, ggplot2, R Markdown) and Tableau embedding.
  • Used R functions to implement principal component analysis (PCA) for dimension reduction.
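The recommendation-system work above can be sketched as item-based collaborative filtering. This is a schematic illustration with invented users, items, and ratings; the actual project used scikit-learn and caret:

```python
import math
from collections import defaultdict

# Invented purchase history: user -> {item: rating}
ratings = {
    "alice": {"laptop": 5, "mouse": 4, "desk": 2},
    "bob":   {"laptop": 4, "mouse": 5, "chair": 3},
    "carol": {"desk": 5, "chair": 4, "lamp": 3},
}

def item_vectors(ratings):
    """Pivot user->item ratings into item -> {user: rating} vectors."""
    vecs = defaultdict(dict)
    for user, items in ratings.items():
        for item, r in items.items():
            vecs[item][user] = r
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse {user: rating} vectors."""
    num = sum(a[u] * b[u] for u in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def recommend(user, ratings, top_n=2):
    """Score unseen items by similarity to items the user already rated."""
    vecs = item_vectors(ratings)
    seen = ratings[user]
    scores = {}
    for item in vecs:
        if item in seen:
            continue
        scores[item] = sum(cosine(vecs[item], vecs[s]) * r
                           for s, r in seen.items())
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("alice", ratings))   # ['chair', 'lamp']
```

On this toy data, "chair" ranks first for alice because it co-occurs with items she rated highly.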

Confidential, Mountain View, CA

Data Architect

Responsibilities:

  • Documented the business and data requirements, developed as-is and target-state architectures, performed gap analysis, and developed a roadmap for issue resolution
  • Developed a hybrid enterprise data warehouse architecture, combined with master data architecture and a governance framework.
  • Created conceptual, logical, and physical data models in third normal form to support the EDW.
  • Developed star and snowflake models to support the business intelligence and dashboard requirements.
  • Developed data lake architecture with Hadoop storage and the Spark processing engine
  • Developed lambda architecture for batch and streaming data/reporting
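The star-schema pattern referenced above can be illustrated with a minimal sketch. The table and column names here are invented for illustration, using SQLite for a self-contained example:

```python
import sqlite3

# Invented star schema: one fact table keyed to two dimension tables
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (
    date_key    INTEGER PRIMARY KEY,
    full_date   TEXT,
    month       TEXT
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units_sold  INTEGER,
    revenue     REAL
);
""")

con.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(1, "2024-01-15", "Jan"), (2, "2024-02-10", "Feb")])
con.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(10, "Widget", "Hardware"), (11, "Gadget", "Hardware")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 10, 3, 30.0), (1, 11, 1, 25.0), (2, 10, 2, 20.0)])

# Typical dashboard query: revenue by month, joining facts to a dimension
rows = con.execute("""
    SELECT d.month, SUM(f.revenue)
    FROM fact_sales f JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.month ORDER BY d.month
""").fetchall()
print(rows)   # [('Feb', 20.0), ('Jan', 55.0)]
```

The fact table holds the measures (units, revenue); descriptive attributes live in the dimensions, which is what makes slice-and-dice dashboard queries like the one above cheap to write.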

Confidential, Columbus, GA

Data Warehouse Architect

Responsibilities:

  • Developed data mart architecture with metadata architecture.
  • Completed star/snowflake schemas (conceptual, logical, and physical data models, using Erwin 9.x) for the data mart, including factless fact tables.

Confidential, San Jose, CA

BI Data Architect

Responsibilities:

  • Designed BI architecture for multiple business groups, working collaboratively to understand needs and build solutions.
  • Created star and snowflake data models that supported the dashboard analytical reports, and operational reports.
  • Developed architecture for in-memory technologies such as SAP HANA, HBase, and MongoDB.
  • Provided OLAP services and intelligent cube design for advanced analytics and Trend analysis reporting.

Confidential, San Jose, CA

BI Architect

Responsibilities:

  • Developed reports for various departments in the company, mapping the reporting requirements from business terms into Business Objects.
  • Developed business object universe and full thin client reports for finance, marketing, and sales.
  • Extracted Level 0 data from the Essbase 6.5.1 cube to a star-schema database in Oracle 8i, and developed dashboard reports in Business Objects for billing, booking, backlog, sales unit analysis, and inventory.
  • Maintained, monitored, and optimized scripts for extracting ROW and EMEA data into the data warehouse, where it was merged and populated into Essbase cubes; scheduled the Essbase load rules and calculations using an in-house tool.

Confidential, Minnetonka, MN

Reporting Analyst

Responsibilities:

  • Developed an integrated data mart to collect information from various legacy systems for producing analytical reports and ad hoc queries; this included developing ETL workflows to extract the source systems' data from the distributed environment and load it into the data mart database.
  • Designed the Star-Schema based warehouse after understanding the business logic.
  • Designed and developed Powerplay cubes and reports in Cognos.
