Enterprise Data Architect, Big Data Architect Resume
Minneapolis, MN
SUMMARY:
- Accomplished professional with more than sixteen years of experience in Enterprise Data Architecture, Big Data Architecture, and Integration Architecture.
- Employed full time at Oracle; previously at Confidential, Confidential, Confidential - BFL (a Confidential company), Confidential, and Pacific Internet.
- Helped prestigious customers (Ameriprise Financial, Confidential, Confidential, Confidential, and numerous Oracle retail customers) achieve their business goals by architecting and delivering enterprise-wide solutions in Data Architecture, Big Data, Integration, Governance, and Performance Engineering.
- Good domain knowledge in Finance (Insurance, Investment, Banking, Stocks), Retail, Logistics, Healthcare, ISP, and Manufacturing.
- Over three years of experience as a Big Data Architect working with Apache Spark, Cloudera Hadoop, Oracle NoSQL, HBase, MongoDB, Neo4j, and the Oracle big data ecosystem.
- More than 5 years of active engagement in data science, both for traditional reporting needs and for big data use cases (pricing, promotion, insights).
- Established foundational know-how and practices for Spark map-reduce programming in Scala for real-time frameworks and Hadoop Java MapReduce on YARN for batch-oriented solutions.
- Led the creation and organization of large-scale data using Cloudera Hadoop and Apache Spark frameworks for predictive modeling and for batch and real-time analytics in complex data analytics projects.
- Used DataFrames and Datasets in Apache Spark, along with Apache Hive, for schema management, data sampling, and quality checks. Created data quality and governance processes with performance considerations (MapReduce partitioning, block size optimization, Hive partitioning).
- Helped data analysts understand the data and apply various statistical techniques (linear regression, t- and p-distributions, chi-square, and more) for data visualization using R.
- Used Oracle Data Integrator and Sqoop to load data from structured data sources into HDFS and Hive tables.
- Created use cases for data augmentation of existing products (pricing engines) to empower them with broader views and deeper insight capabilities.
- Created a Big Data roadmap for analytics solutions in the retail space; created demos and roadshows for senior management and leadership buy-in.
- Combined data from OLAP and big data sources for historical and current trend analysis.
- Augmented and empowered existing applications with data for better decision-making (pricing and promotion) and for the creation of new product offerings in the BI and Analytics space.
- Created various proofs of concept for Big Data solutions by sourcing structured, semi-structured, and unstructured data, i.e., relational, weblogs, external data, social media, and various other data sources. Converted information into valuable data insights and data augmentation for decision, visualization, and prediction processes.
- Married MDM solutions with big data for enterprise data solutions, data enrichment, and data quality.
- Extensive experience in Enterprise Data Architecture across transactional (OLTP) and analytical (OLAP) areas; data modeling using Boyce-Codd normalization techniques, normalization, and star/snowflake schemas. Implemented data architecture frameworks: Zachman, Confidential, TOGAF.
- Enforced data governance and data quality using in-house, script-based, and tool-based solutions; created and stored metadata using tools.
- OLAP solutions using both Bill Inmon’s third Normal form Data warehouse and Kimball’s Dimensional modeling (Star and Snowflake schema).
- Incorporated data warehouse solutions for analytical, aggregation, and historical needs.
- Created operational data stores for reporting and analysis needs. Created data marts for domain-specific analytical requirements.
- Created canonical data models for service frameworks, contracts, and information abstraction.
- Involved in data integration activities, including bulk data integration using Oracle Data Integrator (ODI) and SQL*Loader.
- Integration Architecture using Bulk/Batch data, web services, SOA and Message based integration.
- Very proficient in business analysis and in translating business requirements into technical requirements using tools such as Visio and Rational Rose and CASE tools such as Erwin, PowerDesigner, Oracle Designer 2000, and Oracle SQL Data Modeler.
- Created and adhered to enterprise strategies, architectures, and frameworks, following industry standards such as ACORD (insurance), Confidential banking & financial models, and NRF.
- Followed, when applicable, industry-standard architectural frameworks: Zachman and Oracle Enterprise Architecture.
- Extensive experience in performance engineering, both in SQL query tuning and in AWR report analysis.
- Good Oracle database administration (DBA) and performance tuning skills; expert database programming (Oracle PL/SQL).
- Involved in data analysis and data profiling for legacy systems (mainframe and other legacy systems). Involved in project DevOps.
- Strong communication and interpersonal skills; self-motivated, quick-learning, hardworking team player and leader with client-facing and management experience.
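The ODI/Sqoop loads mentioned in the summary typically follow the standard Sqoop import pattern. A minimal sketch is shown below; the connection string, credentials, schema, table, and directory names are all placeholders, not details from any actual engagement:

```shell
# Hypothetical Sqoop import from an Oracle source into HDFS and a Hive table.
# Host, service name, user, table, and paths are placeholder values.
sqoop import \
  --connect jdbc:oracle:thin:@//db-host:1521/ORCL \
  --username etl_user -P \
  --table SALES.TRANSACTIONS \
  --target-dir /data/raw/transactions \
  --hive-import \
  --hive-table retail.transactions \
  --num-mappers 4
```

The `--num-mappers` setting splits the import across parallel map tasks, which is one of the partitioning/performance considerations noted above.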
TECHNICAL SKILLS:
Framework: Cloudera Hadoop Dist., Apache Spark, Oracle Big Data, Zachman, TOGAF, and Confidential Banking Industry framework.
Ingestion and Integration Tools (ETL): Oracle Data Integrator (ODI), Sqoop, Flume, Kafka, Integration Bus, and Service Bus.
Data Storage: Cloudera HDFS, Oracle NoSQL, HBase, Oracle 12c, 11g, 10g, 9i, 8i, 8.0/7.x, SQL Server 6, MS Access, and Teradata.
Languages/Scripting: Spark-Scala, R, PL/SQL, SQL, XML, Hadoop MapReduce (Java), C++, Shell script.
HDFS/NoSQL Administration and Workflow: Cloudera Manager and Oozie workflow.
CASE /UML Tools: SQL Data Modeler, Erwin, Oracle Designer, Power Designer, Rational.
Applications: Oracle 10SC, Siebel CRM and Remedy AR.
PROFESSIONAL EXPERIENCE:
Enterprise Data Architect, Big Data Architect
Confidential
Responsibilities:
- Created various Big Data proofs of concept using Cloudera Hadoop and the Apache Spark framework, both on a single node and on a fourteen-node cluster.
- Programmed in MapReduce (Java) and Scala on HDFS for batch processing and real-time analytics.
- Streamed web server logs and retail social data (Twitter and product reviews) into HDFS and map-reduced them using a Spark Scala program. Abstracted results for consumption by data analysts in R using various statistical techniques (linear regression, t- and p-distributions).
- Stored data in multiple data stores: Oracle NoSQL, HDFS, and Oracle Database. Combined data from Oracle BI and big data for data augmentation.
- Augmented existing products with broader and more current data for near-real-time analytics.
- Data ingestion using Flume, Kafka, and Oracle Data Integrator (part of the Oracle connectors for big data).
- Performance tuning at the storage level (HDFS block optimization, partitioning, and reducer tuning) and in memory (execution and storage).
- Created Hive queries for complex joins between datasets (DataFrames).
- Map-reduce using Hive, Pig and Scala.
- Hadoop Administration using Cloudera Manager. Workflow using Apache Oozie.
- Architected and created Big Data POCs based on batch and real-time data needs; architected a hybrid cloud solution for information sharing.
- Defined Data Architecture Framework for various Oracle Retail product areas.
- Created the Enterprise Data Model as part of the Architecture and Enterprise Data Warehouse.
- Involved in various Data Analysis, Data Quality, Data Integration, Data Sourcing and Data Mapping.
- Defined and implemented metadata and master data management, data governance, and data quality.
- Data modeling: created project-wise conceptual, logical, and physical data models for both transactional (OLTP) and multidimensional (OLAP) environments. Socialized various artifacts with senior leadership and stakeholders.
- Source-to-target mapping, i.e., data mapping: identified data sources and targets, applied business rules, and consolidated information.
- Led business requirements discussions, conducted client interviews, and converted business requirements into information needs using models such as data flow diagrams.
- Created enterprise-wide process flow diagrams for domain views.
- Proposed solutions, e.g., real-time analytics using GoldenGate and a consumption layer, based on the budget.
- Performance Analysis and design recommendations from Data Architecture perspective.
- Best practices recommendations from Data Architecture perspective.
- Created data governance and metadata management processes.
- Involved in Integration Architecture (Concepts on Retail Integration Bus, Retail Service Bus, Bulk Data).
- Supported the development team in DevOps, i.e., the build process using Gradle.
- Bulk Data Integration using Oracle Data Integrator and Hadoop connectors, Bulk/Batch processing, catalog creation.
Skill/Tool: SQL Data Modeler, MS Office, Visio, PowerPoint, Oracle Data Integrator (ODI) 11g/12c, Oracle 12c and 11gR2 DB, WLS 12.1.0, Java SE 1.7/1.8, Oracle NoSQL, Hive, Flume, Oozie Workflow, HDFS, Integration and Service Bus.
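The Hive partitioning and complex-join work described in this role can be sketched as a CLI invocation of the kind below; the database, table, and column names are hypothetical, chosen only to illustrate partition pruning on a date-partitioned log table:

```shell
# Hypothetical Hive query run from the CLI: join web logs with a product
# dimension. Filtering on event_date (the partition key) lets Hive prune
# partitions instead of scanning the full table.
hive -e "
  SELECT p.category, COUNT(*) AS clicks
  FROM retail.web_logs l
  JOIN retail.products p ON l.product_id = p.product_id
  WHERE l.event_date = '2016-03-01'
  GROUP BY p.category;
"
```

Partitioning by a high-selectivity key such as date is the Hive-side counterpart of the HDFS block-size and reducer tuning listed above.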
Enterprise Data Architect
Confidential, Minneapolis, MN
Responsibilities:
- Data Analysis, Data Quality, Data Integration, Data Sourcing and Data Mapping.
- Meta and Master Data Management.
- Data Modeling - Conceptual, Logical, Abstraction both OLTP and OLAP environments.
- Source to target mapping i.e., Data Mapping.
- Abstraction modeling for SOA in service layer.
- Business Requirements discussions/ Client interviews.
- Create Object Models using Rational tool RSM.
- Process flow diagrams.
- Multiple Recommendations and Solutions based on the budget.
- Performance Analysis and recommendations from Data Architecture perspective.
- Best practices recommendations from Data Architecture perspective.
- Interaction with DBA and Data Integration team and support hand over tasks.
Skill/Tool: Erwin 7.2, PowerDesigner, MS Office, Visio, Rational Rose RSM 7.1, PowerPoint.
Data Architect, Database Analyst, Consultant, DB Programmer
Confidential, Charlotte, NC and New York, NY
Responsibilities:
- Understood client requirements; conducted client interviews.
- Prepared technical design documents.
- Analyzed existing functionality.
- Proposed efficient data solutions and recommendations for existing problems.
- Data modeling using Erwin, Architecting, Data Analysis, Gap Analysis of existing systems and legacy apps.
- Created conceptual and logical data models for various enterprise-wide apps.
- Involved in non-functional design and SOA designs.
- Leading a team of five for eB1 and CDI.
- Database Programmer.
- Generation of DB objects
- Back-end stored procedures, functions, DB triggers, and EOD jobs.
- Performance/Query tuning
- Supporting testing activities both functional and database.
- Database Recommendations for StateDaemon.
- Pre and Post go live production support.
- Created Informatica workflows and mappings. Created ETL processes and mappings using SQL*Loader and PL/SQL packages.
- Built SQL*Loader processes to load data from flat files.
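The SQL*Loader flat-file loads above are typically driven by a control file. A minimal sketch follows; the file names, staging table, columns, and credentials are placeholders for illustration, not actual project details:

```shell
# Hypothetical SQL*Loader run: write a control file describing a CSV layout,
# then load it into a staging table. All names and the connect string are
# placeholder values.
cat > transactions.ctl <<'EOF'
LOAD DATA
INFILE 'transactions.csv'
APPEND INTO TABLE stg_transactions
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(txn_id, txn_date DATE 'YYYY-MM-DD', amount, account_id)
EOF
sqlldr userid=etl_user/secret@ORCL control=transactions.ctl log=transactions.log
```

Loading into a staging table first, then transforming with PL/SQL packages, matches the ETL pattern described in the bullets above.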
Data Architect, Data modeler, Data Consultant and DB Programmer
Confidential, New York
Responsibilities:
- Understood client requirements and interacted with the client.
- Data modeling, Architecting, Data Analysis.
- Created logical and physical data models for the ASO app.
- Sizing estimation and Data profiling.
- Generation of DB objects
- Back-end stored procedures, functions, DB triggers, and EOD jobs.
- Performance/Query tuning on Database and SQL queries
- Functional testing.
- Supporting testing activities.
- Critical role in Disaster recovery exercise from application perspective.
- Developing reports using Crystal reports.
Database Analyst and Data Consultant
Confidential
Responsibilities:
- Understood client requirements and interacted with the client.
- Data modeling, Architecting, Data Analysis, Gap Analysis of existing systems and legacy apps (mainframe, Adabas).
- Creation of Conceptual, Logical data models for various Order and Inventory apps.
- Involved in non-functional design and SOA designs (sizing estimates, backup recovery, DR).
- Generation of DB objects.
- Back-end stored procedures using PL/SQL: functions, DB triggers, and EOD jobs.
- Performance/Query tuning
- Functional testing.
- Supporting testing activities.
- Production support.
- Oracle Forms and Reports.
- Shell Scripts.
- Recommendation for future enhancements from application and database perspective.
Skill/Tool: Oracle 8i R2, SQL*Plus 8.1.6, PL/SQL 91, StatsPack, TKPROF, Oracle Expert, PL/SQL Developer, Developer 2000, Shell scripts.