
Data Solution Architect Principal Resume


Falls Church, VA

SUMMARY:

  • 18+ years of lead data architect/data modeling experience in Finance, Retail, Supply Chain, Manufacturing, and data warehousing
  • Fast technology learner, constantly updating skills and adopting current technology developments
  • Extensive knowledge of building enterprise-wide data warehouse applications using relational and multidimensional models and cubes
  • Experience in project management across the full SDLC, using Agile/Scrum methods on client/server applications in Unix/Linux with big data/AWS cloud, SQL, and NoSQL database projects
  • Good functional knowledge of business processes, process improvement and re-engineering, mapping to system process design methods, and transformation into technical architecture solutions
  • Architecture and sizing estimates for Hadoop nodes and clusters (memory and storage)
  • Configuration and sample results in Hive, HBase, Sqoop, Flume, and Pentaho Data Integration/Analytics
  • Installation of NameNode and DataNodes; setup and configuration of all services
  • Installation of HDFS ecosystem components: ZooKeeper, Hive, HBase, Kafka, Spark, NiFi, and Storm
  • Configuration of connectivity between the NameNode, DataNodes, and all services
  • Verification of all log files and storage locations, and end-to-end testing
  • Data design planning for big data analytics combining Hadoop and the data warehouse
  • Hive table design and various performance tuning options (see the sketch after this list)
  • Creation of a log sheet for all software components and versions
  • Documentation of the setup and backup of original settings
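
To illustrate the Hive table design and tuning options referenced above, here is a minimal HiveQL sketch; the sales_events table and its columns are hypothetical, and partitioning, bucketing, and ORC storage stand in for the typical tuning levers:

    -- Hypothetical fact table; partition pruning, bucketing, and columnar ORC storage are the main tuning options
    CREATE TABLE sales_events (
      event_id    BIGINT,
      customer_id BIGINT,
      amount      DECIMAL(12,2)
    )
    PARTITIONED BY (event_date STRING)          -- queries filtering on event_date scan only matching partitions
    CLUSTERED BY (customer_id) INTO 32 BUCKETS  -- spreads rows evenly for joins and sampling
    STORED AS ORC
    TBLPROPERTIES ('orc.compress' = 'SNAPPY');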

SKILLS:

Database (15 years): Oracle, SQL Server, Teradata, Mainframe DB2/UDB, SQL/NoSQL

OLAP Tools (6 years): Business Objects, Oracle Express/Discoverer, SQL Server 2000/2005 DTS/Analysis Services, Cognos Enterprise Server, PowerPlay, Visualizer, Scheduler, MicroStrategy, Pentaho Analytics

Knowledge in: Microsoft technologies (.NET), XML, Java/J2EE, AI/ML

ETL Tools (5 years): Informatica, DataStage, Cognos, SQL Server DTS Services, Oracle data warehousing tools (OWB, ODI), InfoSphere

Data Modeling/Data Warehousing/Process Design (12 years): Enterprise data warehouse design/model architecture and virtual/self-service data analytics, using Rational Rose/UML, ERwin, ER/Studio, Oracle Designer, Visio, and MS Office tools

Big Data Tools (experience and knowledge, 5 years): Hadoop, MapReduce, HBase, Cloudera, MongoDB, Lucene/Elasticsearch, ZooKeeper, Kafka/Spark/Storm, Neo4j, Sqoop, Flume, Pentaho, Splunk, Solr/Lucene

Cloud Data Solutions (storage & tools) (3 years): AWS cloud storage and tool selection, data lake architecture setup with structured/unstructured data, data ingestion tools, predictive and data mining analytics tool vendor evaluations, open source tools

PROFESSIONAL EXPERIENCE:

Confidential, Falls Church, VA

Data Solution Architect Principal

Responsibilities:

  • Proposal for a geospatial data solution using big data spatial tools/AWS
  • Proposal for capacity planning, hardware sizing, and architecture for aviation
  • Proposal for multi-source data integration using Azure Data Lake and Data Factory
  • Delivered DW, MDM, and data governance practice solutions to a federal client
  • Designed architecture roadmaps for the DW/big data design of a client's anti-fraud solution
  • Capacity planning, hardware sizing (100 TB), and architecture write-up for proposals
  • Demo preparation and planning with big data and a fraud COTS product
  • Reviewed BI, data mining, and predictive analytics products with business/systems analysts
  • Participated in IV&V and STIG compliance processes for system suitability and testing
  • Delivered DW design (Redshift) and performance improvements for new and existing designs (see the sketch after this list)
  • Proposals/solutions for Navy, VA, FEMA, CMS, and FAA in data management areas
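
As an illustration of the Redshift DW design and performance work above, here is a minimal DDL sketch; the fact_claims table and its distribution/sort choices are hypothetical:

    -- Hypothetical fact table; DISTKEY collocates join keys, SORTKEY lets zone maps prune date-range scans
    CREATE TABLE fact_claims (
      claim_id    BIGINT,
      provider_id BIGINT,
      claim_date  DATE,
      paid_amount DECIMAL(12,2)
    )
    DISTSTYLE KEY
    DISTKEY (provider_id)
    SORTKEY (claim_date);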

Environment: Oracle, Cloudera, Apache Hadoop, Sqoop/Spark, Hive, HBase, AWS Redshift, Azure, Pentaho BI, MongoDB, JSON, XML, Informatica

Confidential, Washington, DC

Sr Data Architect

Responsibilities:

  • Developed enterprise logical/physical models complying with the FEAF framework
  • Designed from transaction sources, financials, and Salesforce broker data
  • Designed the ODS/data warehouse using Oracle and the IBM BigInsights Hadoop framework
  • Data analytics using BigSheets analytic sheets from Netezza DB storage
  • Data design planning for big data analytics combining Hadoop and Netezza storage
  • Data transformation plan into Hadoop and out of Hadoop to Oracle for further reporting
  • Architecture and sizing estimates for Hadoop nodes and clusters
  • Configuration and sample results in Hive, HBase, Sqoop, and BigSheets
  • Strategic plan for social media web analytics using Elasticsearch and Kibana

Environment: Oracle, Netezza, Sybase, ER/Studio, IBM BigInsights, Apache Sqoop/Spark, Python, Java/J2EE, AWS S3, EC2, Pentaho, MongoDB, Watson, Informatica, Cognos, Oracle Federal Financials

Confidential, Herndon, VA

Sr. Data Architect/Enterprise Architect

Responsibilities:

  • Involved in the enterprise data warehouse effort, identifying prime source systems
  • Tracked enterprise-level data governance and model standards using the Erwin repository
  • Grouped applications by business process, covering business and service applications, both internal and external
  • Enterprise data security, quality, and naming standards; MDM (DRM); and TOGAF
  • Served as data architect on the Open Payments healthcare project (transaction and data mart)
  • Data planning and preparation for a big data pilot project using Cloudera/Apache Hadoop/HBase/Pig/R
  • Hive table design and various performance tuning options
  • Configured and tested Sqoop/Spark/Shark for data ingestion into Hadoop
  • DW design (Kimball) from sources such as financials, HR, and custom applications (see the star schema sketch after this list)
  • Designed analytics using the Pentaho analytics model
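
To illustrate the Kimball-style DW design above, here is a minimal star schema sketch in Oracle SQL; the employee dimension, payroll fact, and all column names are hypothetical:

    -- Hypothetical dimension with a surrogate key (supports slowly changing dimensions)
    CREATE TABLE dim_employee (
      employee_key   INTEGER PRIMARY KEY,
      employee_id    VARCHAR2(20),     -- natural key from the HR source
      department     VARCHAR2(50),
      effective_date DATE
    );

    -- Hypothetical fact table at the pay-period grain, keyed to its dimensions
    CREATE TABLE fact_payroll (
      employee_key   INTEGER REFERENCES dim_employee (employee_key),
      pay_period_key INTEGER,          -- would reference a date/period dimension
      gross_pay      NUMBER(12,2),
      net_pay        NUMBER(12,2)
    );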

Environment: Oracle 11g, MicroStrategy, DB2, SQL Server, Erwin, Unix, SharePoint, Big Data (Cloudera, Hive, HBase, R, Spark/Kafka, Impala), Tableau, SAS, Python, Java/J2EE, Jenkins, AWS, Pentaho, MongoDB, Cognos, Oracle Federal Financials, Informatica

Confidential, Washington, DC

Lead/Data Architect

Responsibilities:

  • Establish data design standards and star schema reporting designs
  • Establish data flow and data quality standards
  • Develop complete ETL design and load controls for target tables (see the load sketch after this list)
  • Design the logical and physical models of the Oracle 11g database with metadata information
  • Pilot project on Hadoop using Cloudera/HBase/Pig/SAS/R/Apache
  • Lead Oracle BI report development and user training
  • Maintain all project contract, software development, and security compliance documentation, following the submission guidelines (NIST, FEAF)
  • Data plan/design for Hadoop and Oracle storage and reporting
  • Configure and test Sqoop/Spark/Pentaho for data ingestion into Hadoop and Oracle
  • Maintain the source and data objects standards and version controls
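
As an illustration of the controlled ETL loads into Oracle target tables above, here is a minimal MERGE sketch; the staging/target table names and audit columns are hypothetical:

    -- Hypothetical upsert from a staging table into a warehouse target with simple load-control columns
    MERGE INTO dw_customer tgt
    USING stg_customer src
       ON (tgt.customer_id = src.customer_id)
    WHEN MATCHED THEN
      UPDATE SET tgt.customer_name = src.customer_name,
                 tgt.updated_dt    = SYSDATE,
                 tgt.batch_id      = src.batch_id
    WHEN NOT MATCHED THEN
      INSERT (customer_id, customer_name, created_dt, batch_id)
      VALUES (src.customer_id, src.customer_name, SYSDATE, src.batch_id);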

Environment: Oracle 11g RAC, ODI, OBIEE, ArcGIS, Web Services and SOA Suite, Hadoop, Erwin, Unix, SharePoint, Big Data (Cloudera, HBase, R data mining, Oracle SOA services), Java/J2EE, Python, AWS

Confidential, Washington, DC

Lead/ Enterprise Data Architect

Responsibilities:

  • Design fact and dimension tables (star schema) based on the source claims and enrollment data
  • Conversion/migration from COBOL flat files to DB2 target tables for the reporting collection
  • Design the logical and physical models of the DB2 reporting database with metadata information
  • Create data mappings and ETL specs for loading from flat files into DB2 target tables (see the mapping sketch after this list)
  • Analyze new report requirements and review target table designs for HIPAA compliance
  • Report analysis compliant with ICD-9 and ICD-10 standards
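
To illustrate the flat-file-to-DB2 mapping work above, here is a minimal DB2 SQL sketch; the staging table (assumed to be populated from the COBOL flat file) and all column names are hypothetical:

    -- Hypothetical staging-to-target load with the type conversions an ETL spec would document
    INSERT INTO rpt.claims_target (claim_id, member_id, icd_code, service_date, paid_amount)
    SELECT CAST(s.claim_id AS BIGINT),
           s.member_id,
           s.icd_code,                             -- ICD-9 or ICD-10 code as received
           DATE(s.service_dt_text),                -- convert the flat-file text date
           CAST(s.paid_amt_text AS DECIMAL(12,2))
    FROM   stg.stg_claims s
    WHERE  s.claim_status = 'A';                   -- only accepted claims for reporting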

Environment: DB2, COBOL, Mainframe, DB Studio, Netezza, Rocket Shuttle, Informatica, Erwin, Unix, SharePoint, TIBCO/JMS messaging, IBM InfoSphere, Cognos

Confidential, Quantico, VA

Lead/ Data Architect

Responsibilities:

  • Develop the project scope and data migration plans for the Retail and Finance data marts
  • Design fact and dimension tables (star schema) based on the source Retail and Finance data
  • Plan and schedule the mappings for data loads on DEV and PROD servers
  • Review the logical/physical model, table re-org/design, SQL execution plans, indexes, database memory parameters, disk cache sizes, disk storage area sizes, and throughput rates while executing reports, as performance improvements applying the DoDAF framework (see the execution plan sketch after this list)
  • Create data mappings for the Retail and Finance data marts using OWB
  • Analyze new report requirements to match ETL load mappings
  • Define data architecture and EIA standards and data definitions per the DoDAF standard
  • Define and capture the ETL mapping change process for documentation
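
As an illustration of the SQL execution plan reviews above, here is a minimal Oracle sketch; the report query and its fact table are hypothetical:

    -- Capture and inspect the plan for a hypothetical Retail report query
    EXPLAIN PLAN FOR
      SELECT s.store_id, SUM(s.sales_amount)
      FROM   retail_sales_fact s
      WHERE  s.sales_date >= DATE '2009-01-01'
      GROUP  BY s.store_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);  -- shows access paths, index usage, and estimated cost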

Environment: Oracle 10g RAC, OWB 10.2, Erwin, Unix, ClearCase tools, SharePoint, Cognos

Confidential, Reston, VA

Lead/Data Integration Architect

Responsibilities:

  • Build various API interfaces using web services for business groups as well as for external vendors' common use, facilitating data interchange in a common XML/XSLT format
  • Develop data mappings for interface requirements with a standard canonical data model
  • Maintain/update logical/physical data models according to naming standards, grouped by function
  • Involved in MDM, data governance, and data quality processes under TOGAF
  • Create data warehouse mappings using OWB, connecting different source systems
  • Create XSD schemas and WSDLs and validate them using XML Spy

Environment: Oracle 10g, ER/Studio, XML Spy, Contivo, SQL Server 2005, Unix, MS Project, ClearCase tools, MicroStrategy, Informatica, Cognos

Confidential, Mclean, VA

Lead/Sr. Business/Data Architect

Responsibilities:

  • Develop the project scope and data migration plans from PeopleSoft HR to external ADP systems
  • Study the migration's impact on the data warehouse and staging tables
  • Review table re-org/design, SQL execution plans, indexes, database memory parameters, disk cache sizes, index clustering factors, and throughput rates while executing reports, as performance improvements
  • Create data mappings from the ADP system to the data warehouse using OWB, based on SOA
  • Organize the data dictionary effort covering all IT systems in the organization

Environment: PeopleSoft HR, Oracle 10g, ER/Studio, SQL Server 2005, Unix, MS Project, ClearCase tools, Visio, SharePoint, Informatica, Cognos

Confidential, Mclean, VA

Project Lead - Data Architect

Responsibilities:

  • Maintain the repository for data models (Erwin) and the data dictionary for seller projects
  • Responsible for changing/maintaining logical/physical models and generating DDL to implement the changes in the database (DB2 8.1, Sybase 12.5, Oracle); see the DDL sketch after this list
  • Manage change control and maintain project artifacts in ClearCase and ClearQuest per standard SOX compliance
  • Follow data abbreviation standards using metadata repository rules in logical and physical modeling, for a project using web services and XML schemas for interfaces
  • Discuss business requirements and guide the team on data redundancy, storage, and performance while changing the data models
  • Identify/document data aggregations, facts, and dimensions for the data warehouse module and enhance reporting options using the star schema
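
To illustrate the model-driven DDL generation above, here is a minimal sketch of an additive column change; the table and column names are hypothetical, and the dialect notes reflect general DB2/Oracle/Sybase behavior:

    -- Hypothetical change generated from a model update: add a column to an existing table
    -- DB2 8.1 / Oracle form:
    ALTER TABLE loan_seller ADD seller_rating CHAR(2);
    -- Sybase ASE 12.5 form (new columns must be nullable or carry a default):
    ALTER TABLE loan_seller ADD seller_rating CHAR(2) NULL;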

Environment: Mainframe, Sybase, DB2, Oracle, Erwin, Unix, MS Project, ClearCase, ClearQuest, Visio, Lotus Notes, Web services, DB Artisan, MicroStrategy, DataStage, Cognos
