Solution Architect Principal Resume
Falls Church, VA
SUMMARY:
- 18+ years of lead data architect/data modeling experience in Finance, Retail, Supply Chain, Manufacturing, and data warehousing. Fast technology learner, constantly updating and adopting current technology developments. Extensive knowledge in building enterprise-wide data warehouse applications using multidimensional models, cubes, and dynamic web reports
- Experience in project management methodology across the full SDLC for client/server applications in Unix/NT/Novell environments using relational databases such as Oracle 9.0 and SQL Server 2000 in web service based projects
- Good functional knowledge of business processes, improvement and process re-engineering, mapping to system process design methods, and technical architecture solutions
TECHNICAL SKILLS:
Operating System: Windows NT/2000/XP, Unix/Linux, AS/400
Database (15 years): Oracle 11g, Access 2000, SQL Server 2005, Teradata, Mainframe DB2/UDB, Exadata
Languages Used: SQL Navigator, PL/SQL, COBOL (Data General/PC), VB, Python, R, Perl, Scala, SPSS
OLAP Tools: Business Objects, Oracle Express/Discoverer, SQL Server 2000/2005 DTS/Analysis Services, Cognos Enterprise Server, PowerPlay, Visualizer, Scheduler, MicroStrategy, Pentaho Analytics
Knowledge in: Microsoft technologies (VB, .NET, ASP, FrontPage web components), XML, Java/J2EE, IBM Netezza appliance
ETL Tools: DB2 OLAP/Essbase using Application Manager, Informatica, DataStage
PROFESSIONAL EXPERIENCE:
Confidential, Falls Church, VA
Solution Architect Principal
Responsibilities:
- Proposal for geospatial data solution using big data spatial/AWS
- Proposal for capacity planning, hardware sizing, and architecture for aviation
- Proposal for multi-source data integration including satellite, radar, and weather data
- Authored big data, cloud, and data lake white papers
- Big data design and architecture for an anti-fraud solution for an SSA client
- Capacity planning, hardware sizing, and architecture write-up for proposal (a sizing sketch follows this section)
- Demo preparation and planning with big data and fraud COTS products
- Review BI, data mining, and predictive analytics products
- Managed VPATS product compliance
- Compare client infrastructure with the proposal to identify gaps
- Proposals/solutions for Navy, VA, FEMA, CMS, and FAA in data management areas
Environment: Oracle, Cloudera, Apache Hadoop Sqoop/Spark, Hive, HBase, AWS Redshift, Azure, Pentaho BI, MongoDB
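As a worked illustration of the capacity-planning and hardware-sizing math mentioned above, the short Python sketch below derives a data-node count from assumed inputs; the ingest volume, retention, replication factor, and per-node disk figures are placeholders rather than numbers from any actual proposal.

```python
# Rough Hadoop storage sizing sketch -- all inputs are illustrative assumptions,
# not figures from any actual client or proposal.

daily_ingest_tb = 2.0           # assumed raw ingest per day (TB)
retention_days = 365            # assumed retention window
replication_factor = 3          # HDFS default replication
overhead_factor = 1.25          # allowance for temp/intermediate space
usable_disk_per_node_tb = 36.0  # assumed usable disk per data node (TB)

raw_tb = daily_ingest_tb * retention_days
total_tb = raw_tb * replication_factor * overhead_factor
nodes = -(-total_tb // usable_disk_per_node_tb)  # ceiling division

print(f"Raw data over retention window: {raw_tb:.0f} TB")
print(f"Storage after replication and overhead: {total_tb:.0f} TB")
print(f"Estimated data nodes required: {int(nodes)}")
```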
Confidential, Washington, DC
Sr Data Architect
Responsibilities:
- Data warehouse project for SEC with a big data framework
- Design from transaction systems for SEC broker/company data
- Design ODS/data warehouse using Oracle and the IBM BigInsights Hadoop framework
- Data analytics using analytic sheets from Netezza DB storage
- Data design planning for big data analytics combining Hadoop and Netezza storage
- Data transformation plan into Hadoop and out of Hadoop to Oracle for further reporting (see the sketch after this section)
- Architecture and sizing estimates for Hadoop nodes and clusters
- Configuration and sample results in Hive, HBase, Sqoop, and BigSheets
- Twitter feed analysis in HDFS on stock market data
Environment: Oracle, Netezza, Sybase, ER/Studio, IBM BigInsights, Apache Sqoop/Spark, Python, Java/J2EE, AWS S3, EC2, Pentaho, MongoDB, Watson, Informatica, Cognos
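A minimal PySpark sketch of the Hadoop-to-Oracle hand-off described above, under the assumption of a curated Parquet dataset in HDFS; the HDFS path, column names, JDBC URL, and credentials are illustrative placeholders, not the project's actual values.

```python
# Minimal PySpark sketch: read curated data from HDFS and push it to Oracle
# over JDBC for downstream reporting. Paths, URL, and credentials are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hdfs_to_oracle").getOrCreate()

# Read a curated Parquet dataset from HDFS (placeholder path)
df = spark.read.parquet("hdfs:///curated/broker_filings")

# Light transformation before hand-off to the reporting database
report_df = (
    df.filter(df.filing_date >= "2015-01-01")
      .select("broker_id", "filing_date", "filing_type")
)

# Write to an Oracle reporting table via JDBC (placeholder connection details)
report_df.write.format("jdbc") \
    .option("url", "jdbc:oracle:thin:@//dbhost:1521/REPORTDB") \
    .option("dbtable", "RPT.BROKER_FILINGS") \
    .option("user", "report_user") \
    .option("password", "change_me") \
    .option("driver", "oracle.jdbc.OracleDriver") \
    .mode("append") \
    .save()
```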
Confidential, Herndon, VA
Sr. Data Architect/Enterprise Architect
Responsibilities:
- Involved in enterprise data warehouse effort to identify prime source systems
- Track enterprise-level data movement and data model standards using the Erwin repository
- Group applications by business process, business and service applications, including internal/external
- Enterprise data security, quality and naming standards, MDM, and TOGAF
- Open Payments healthcare project as data architect (transactional and data mart)
- Data planning and preparation for a big data pilot project using Cloudera/Apache Hadoop/HBase/Pig/R
- Hive table design and various performance tuning options (see the Hive sketch after this section)
- Configure and test Sqoop/Spark/Shark for data ingestion into Hadoop
- Data design for Hadoop and Oracle usage using Sqoop and Flume
- Design analytics using the Pentaho analytics model
Environment: Oracle 11g, MicroStrategy, DB2, SQL Server, Erwin, Unix, SharePoint, big data (Cloudera, Hive, HBase, R, Spark/Kafka, Impala), Tableau, SAS, Python, Java/J2EE, Jenkins, AWS, Pentaho, MongoDB, Cognos
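To illustrate the Hive table design and tuning work noted above, here is a small sketch that creates a partitioned, ORC-backed Hive table through Spark SQL; the database, table, and column names are hypothetical, and the settings shown are common options rather than the project's specific configuration.

```python
# Sketch of a partitioned, ORC-backed Hive table with common tuning settings,
# issued through Spark SQL. Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("hive_table_design") \
    .enableHiveSupport() \
    .getOrCreate()

# Partition by load date so queries can prune partitions; store as ORC for
# columnar compression and predicate pushdown.
spark.sql("""
    CREATE TABLE IF NOT EXISTS payments.open_payments (
        record_id      BIGINT,
        provider_id    STRING,
        payer_name     STRING,
        payment_amount DECIMAL(12,2)
    )
    PARTITIONED BY (load_date STRING)
    STORED AS ORC
""")

# Enable dynamic partitioning so inserts route rows to the right partition.
spark.sql("SET hive.exec.dynamic.partition = true")
spark.sql("SET hive.exec.dynamic.partition.mode = nonstrict")
```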
Confidential, Federal, Washington DC
Lead/Data Architect
Responsibilities:
- Establish data design standards and star schema reporting designs
- Establish data flow and data quality standards
- Develop complete ETL design and controls to load target tables (a load-control sketch follows this section)
- Design logical and physical models of the Oracle 11g database with metadata information
- Pilot project on Hadoop using Cloudera/HBase/Pig/SAS/R/Apache
- Lead Oracle BI report development and user training
- Maintain all project contract, software development, and security compliance documentation and follow the submission guidelines (NIST, FEAF)
- Data plan/design for Hadoop and Oracle storage and reporting
- Configure and test Sqoop/Spark/Pentaho for data ingestion into Hadoop and Oracle
- Maintain source and data object standards and version control
Environment: Oracle 11g RAC, ODI, OBIEE, ArcGIS, Web Services and SOA Suite, Hadoop, Erwin, Unix, SharePoint, big data (Cloudera, HBase, R data mining, Oracle SOA service), Java/J2EE, Python, AWS
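The load controls mentioned above can be illustrated with a simple reconciliation check: the sketch below compares a staged-file row count against the rows loaded into a target table. The file path, connection string, and table name are assumptions for the example, and cx_Oracle stands in for whichever client library was actually used.

```python
# Sketch of a simple ETL load control: compare the row count of a staged
# source file against the rows loaded into the target table, and fail the
# job if they disagree. File path, DSN, and table name are placeholders.
import csv
import sys
import cx_Oracle  # assumes the Oracle client driver is installed

SOURCE_FILE = "/data/staging/daily_extract.csv"
TARGET_TABLE = "DW.FACT_DAILY_LOAD"   # placeholder target table
LOAD_DATE = "2016-01-15"

# Count data rows in the staged extract (minus the header row)
with open(SOURCE_FILE, newline="") as fh:
    source_count = sum(1 for _ in csv.reader(fh)) - 1

# Count rows loaded for the same business date in the target table
conn = cx_Oracle.connect("etl_user", "change_me", "dbhost:1521/DWPROD")
cur = conn.cursor()
cur.execute(
    f"SELECT COUNT(*) FROM {TARGET_TABLE} WHERE load_date = :1", [LOAD_DATE]
)
target_count = cur.fetchone()[0]
conn.close()

if source_count != target_count:
    print(f"Load control FAILED: source={source_count}, target={target_count}")
    sys.exit(1)
print(f"Load control passed: {source_count} rows reconciled")
```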
Confidential, Washington, DC
Lead/ Enterprise Data Architect
Responsibilities:
- Design fact and dimension tables (star schema) based on the source claims and enrollment data
- Conversion/migration from COBOL flat files to DB2 target tables for report collection (a parsing sketch follows this section)
- Design logical and physical models of the DB2 report database with metadata information
- Create data mapping and ETL specs for loading from flat files into DB2 target tables
- Analyze new report requirements and review the target table design for HIPAA compliance
- ICD-9/ICD-10 standards-compliant report analysis
Environment: DB2, COBOL, Mainframe, DB Studio, Netezza, Rocket Shuttle, Informatica, Erwin, Unix, SharePoint, TIBCO/JMS messaging, IBM InfoSphere, Cognos
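A brief sketch of the COBOL flat-file conversion pattern referenced above: parse fixed-width records into typed rows ready for a DB2 load. The record layout (field names, offsets, implied decimal) is hypothetical, standing in for the real copybook definition.

```python
# Sketch of reading a fixed-width (COBOL-style) claims extract and turning it
# into rows suitable for a DB2 load. The record layout below is hypothetical.
from decimal import Decimal

# (field name, start, length) -- offsets are illustrative, not the real copybook
LAYOUT = [
    ("claim_id",      0, 10),
    ("member_id",    10, 9),
    ("service_date", 19, 8),    # YYYYMMDD
    ("icd_code",     27, 7),
    ("paid_amount",  34, 11),   # 9(9)V99, implied two decimal places
]

def parse_record(line: str) -> dict:
    row = {}
    for name, start, length in LAYOUT:
        raw = line[start:start + length].strip()
        if name == "paid_amount":
            row[name] = Decimal(raw) / 100  # restore the implied decimal point
        else:
            row[name] = raw
    return row

with open("claims_extract.dat") as fh:
    rows = [parse_record(line.rstrip("\n")) for line in fh if line.strip()]

print(f"Parsed {len(rows)} claim records ready for DB2 load")
```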
Confidential, Quantico, VA
Lead/ Data Architect
Responsibilities:
- Develop project scope and data migration plans for the Retail and Finance data marts
- Design fact and dimension tables (star schema) based on the source Retail and Finance data (see the star schema sketch after this section)
- Plan and schedule the mappings for data loads on DEV and Prod servers
- Review logical/physical models, table re-org/design, SQL execution plans, indexes, database memory parameters, disk cache sizes, disk storage area size, and throughput rate while executing reports for performance improvement
- Create data mappings for the Retail and Finance data marts using OWB
- Analyze new report requirements to match ETL load mappings
- Define data architecture and EIA standards and data definitions per the DoDAF standard
- Define and capture the change process in ETL mapping for documentation
Environment: Oracle 10g, OWB 10.2, RAC, Erwin, Unix, ClearCase tools, SharePoint, Cognos
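The star-schema design work above follows the standard fact-and-dimension pattern; the sketch below shows illustrative DDL for one fact table keyed to two conformed dimensions. Table and column names are invented for the example and are not the actual Retail/Finance data mart model.

```python
# Sketch of the star-schema pattern: one fact table whose foreign keys point
# at conformed dimensions. Names are illustrative, not the real data mart model.
DDL_STATEMENTS = [
    """CREATE TABLE dim_date (
           date_key      NUMBER PRIMARY KEY,
           calendar_date DATE,
           fiscal_period VARCHAR2(10)
       )""",
    """CREATE TABLE dim_store (
           store_key  NUMBER PRIMARY KEY,
           store_name VARCHAR2(100),
           region     VARCHAR2(50)
       )""",
    """CREATE TABLE fact_sales (
           date_key     NUMBER REFERENCES dim_date(date_key),
           store_key    NUMBER REFERENCES dim_store(store_key),
           sales_amount NUMBER(12,2),
           units_sold   NUMBER
       )""",
]

for ddl in DDL_STATEMENTS:
    # In practice these would be deployed through OWB; printing keeps the sketch simple.
    print(ddl + ";\n")
```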
Confidential, Reston, VA
Lead/Data Integration Architect
Responsibilities:
- Develop data mapping for interface requirements with a standard canonical data model
- Maintain/update logical/physical data models according to naming standards, grouped by function
- Involved in MDM, data governance, and data quality processes within TOGAF
- Create data warehouse mappings using OWB connecting different source systems
- Create XSD schemas and WSDLs and validate them using XML Spy (a validation sketch follows this section)
Environment: Oracle 10g, ER/Studio, XML Spy, Contivo, SQL Server 2005, Unix, MS Project, ClearCase tools, MicroStrategy, Informatica, Cognos
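As a scripted counterpart to the XML Spy validation noted above, the sketch below validates an interface message against its XSD using lxml; the schema and message file names are placeholders.

```python
# Sketch of validating an interface message against its XSD schema using lxml
# (a scripted counterpart to the XML Spy checks). File names are placeholders.
from lxml import etree

# Load and compile the canonical schema
schema_doc = etree.parse("canonical_order.xsd")
schema = etree.XMLSchema(schema_doc)

# Parse a sample interface message
message = etree.parse("sample_order_message.xml")

if schema.validate(message):
    print("Message conforms to the canonical schema")
else:
    # Report each validation problem with its line number
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
```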
Confidential, McLean, VA
Lead/Sr. Business/Data Architect
Responsibilities:
- Develop project scope and data migration plans from PeopleSoft HR to the external ADP system
- Study the impact on data warehouse and staging tables due to the migration
- Review table re-org/design, SQL execution plans, indexes, database memory parameters, disk cache sizes, index clustering factors, and throughput rate while executing reports for performance improvement
- Create data mapping from the ADP system to the data warehouse using OWB based on SOA
- Organize the data dictionary effort covering all IT systems in the organization
Environment: PeopleSoft HR, Oracle 10g, ER/Studio, SQL Server 2005, Unix, MS Project, ClearCase tools, Visio, SharePoint, Informatica, Cognos
Confidential, McLean, VA
Project Lead - Data Architect
Responsibilities:
- Maintain repository for data models (Erwin) and data dictionary for seller projects
- Responsible for changing/maintaining logical/physical models and generating DDLs to implement the changes in the databases (DB2 8.1, Sybase 12.5, Oracle); a DDL-generation sketch follows this section
- Manage change control and maintain project artifacts in ClearCase and ClearQuest tools per standard SOX compliance
- Follow data abbreviation standards using metadata repository rules in logical and physical modeling for projects using web services and XML schemas for interfaces
- Discuss business requirements and guide the team on data redundancy, storage, and performance while changing the data models
- Identify/document data aggregation, facts, and dimensions for the data warehouse module and enhance reporting options using a star schema
Environment: Mainframe, Sybase, DB2, Oracle, Erwin, Unix, MS Project, ClearCase/ClearQuest tools, Visio, Lotus Notes, Web services, DB Artisan, MicroStrategy, DataStage, Cognos
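To illustrate the model-change-to-DDL step mentioned above, here is a small sketch that generates ALTER TABLE statements while applying naming-abbreviation rules; the abbreviation map, table, and columns are hypothetical stand-ins for the metadata-repository standards.

```python
# Sketch of generating ALTER TABLE DDL from a described model change while
# applying naming-abbreviation rules. The abbreviation map, table, and columns
# are hypothetical, standing in for the metadata-repository standards.
ABBREVIATIONS = {"number": "nbr", "amount": "amt", "description": "desc"}

def physical_name(logical_name: str) -> str:
    """Apply abbreviation standards to a logical attribute name."""
    words = logical_name.lower().split()
    return "_".join(ABBREVIATIONS.get(w, w) for w in words)

def alter_table_ddl(table: str, new_columns: list[tuple[str, str]]) -> str:
    """Build an ALTER TABLE ADD statement for the new physical columns."""
    cols = ",\n    ".join(
        f"{physical_name(name)} {datatype}" for name, datatype in new_columns
    )
    return f"ALTER TABLE {table} ADD (\n    {cols}\n);"

# Example model change: two new attributes on a hypothetical loan fact table
print(alter_table_ddl("loan_fact", [
    ("original loan amount", "DECIMAL(12,2)"),
    ("loan number", "VARCHAR(20)"),
]))
```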