
Senior Data Architect/Big Data Analytics Employee (Full Time) Resume


Atlanta, GA

TECHNICAL SKILLS:

Database Platform(s): DB2, Teradata, Oracle, SQL Server, Informix, Cassandra, Hadoop, UDB, MySQL

Cloud Platform(s): Google (BigQuery, BigTable, GCF), Pivotal Cloud Foundry

Big Data: Hadoop, Data Lake, Data Factory, HBase, Mongo, Hive, MapReduce, Spark, Yarn, Tableau, Pig, Flume

Data Architecture Tool(s): Embarcadero (ER/Studio), Erwin

Data Migration Tool(s): Informatica

Programming Language(s): C/C++, Perl, SQL, Pro*C, Korn/Bash Shell Script, Java, Python, CQL, Spark SQL

Operating System(s): Unix, z/OS, Linux

Methodologies: Agile, Waterfall, Rapid Application Development (RAD)

Other: SAP, Oracle General Ledger, HP Quality Center, Metadata Repository (ARz/OS), Visio, Virtual Storage Access Method (VSAM) files, Enterprise Data Warehouse (EDW), Web Services, Domain Driven Design

PROFESSIONAL EXPERIENCE:

Confidential, Atlanta, GA

Senior Data Architect/Big Data Analytics Employee (Full Time)

Responsibilities:

  • Worked with software engineers and business analysts on the data requirements, design and creation of multiple OLTP and OLAP data models in the following areas: Finance, Supply Chain, Point-Of-Sale, Master Data, Sales/Marketing, IT Security, and Merchandise on various platforms such as DB2, Teradata (EDW), Oracle, MS SQL Server and Informix.
  • Worked with software engineers, data scientists, and business analysts on the data requirements, design, and creation of a data factory to house transaction data sourced from internal and external environments, used for analytical processing and as a source for the Google Cloud Platform. Tasks also included creating processes to move data into and out of the data factory using tools such as MapReduce, Sqoop, Bash scripts, Flume, Tableau, and Spark SQL on Hadoop components such as HDFS, Hive, HBase, Mongo, and Yarn (see the Spark SQL sketch after this list).
  • Worked with the enterprise architects on the conversion of sales data from Teradata 3NF models into Google BigQuery. The main objective was to collapse each set of normalized Teradata rows into a single denormalized BigQuery record while producing identical report results on both platforms (see the BigQuery sketch after this list).
  • Worked with software engineers on Cassandra data architecture concerns, including sharding, memtables, SSTables, tombstones, compaction, and the client consistency level (see the Cassandra driver sketch after this list).
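
The data-factory work above combined several ingestion and analytics tools. As a minimal, hypothetical sketch (database, table, and column names are assumptions, not details from the actual project), the PySpark snippet below reads a Hive-backed transaction table landed in the data factory and publishes a daily aggregate for downstream reporting:

    # Minimal PySpark sketch: query a Hive-backed data-factory table and publish
    # a daily aggregate for analytics/Tableau. All names are hypothetical.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("data-factory-daily-sales")
        .enableHiveSupport()   # use the Hive metastore backing the data factory
        .getOrCreate()
    )

    # Spark SQL over raw transactions landed in HDFS/Hive (e.g., via Sqoop or Flume).
    daily_sales = spark.sql("""
        SELECT store_id,
               CAST(txn_ts AS DATE) AS txn_date,
               SUM(net_amount)      AS net_sales,
               COUNT(*)             AS txn_count
        FROM   datafactory.pos_transactions
        GROUP  BY store_id, CAST(txn_ts AS DATE)
    """)

    # Persist the aggregate as a partitioned Hive table for downstream consumers.
    (daily_sales.write
        .mode("overwrite")
        .partitionBy("txn_date")
        .saveAsTable("datafactory.daily_store_sales"))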
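
For the Teradata-to-BigQuery conversion, a common way to collapse 3NF parent/child rows into a single BigQuery record is ARRAY_AGG over a STRUCT of the child columns. The sketch below uses the google-cloud-bigquery Python client; the dataset, table, and column names are assumptions, not details from the actual engagement:

    # Hypothetical sketch: collapse normalized order header/line rows into one
    # nested BigQuery record per order, so reports can match the 3NF source.
    from google.cloud import bigquery

    client = bigquery.Client()  # assumes default GCP credentials and project

    denormalize_sql = """
    CREATE OR REPLACE TABLE sales_mart.orders_denorm AS
    SELECT
      h.order_id,
      h.order_date,
      h.store_id,
      ARRAY_AGG(STRUCT(l.line_number, l.sku, l.quantity, l.net_amount)
                ORDER BY l.line_number) AS order_lines
    FROM   sales_stage.order_header AS h
    JOIN   sales_stage.order_line   AS l
           ON l.order_id = h.order_id
    GROUP  BY h.order_id, h.order_date, h.store_id
    """

    client.query(denormalize_sql).result()  # block until the job completes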
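
Of the Cassandra topics above, the client consistency level is set per statement by the application driver rather than in the schema. The snippet below is a minimal sketch using the DataStax Python driver (cassandra-driver); the contact point, keyspace, table, and key value are hypothetical:

    # Hypothetical sketch: tune the read consistency level per statement.
    from cassandra import ConsistencyLevel
    from cassandra.cluster import Cluster
    from cassandra.query import SimpleStatement

    cluster = Cluster(["10.0.0.10"])    # contact point is a placeholder
    session = cluster.connect("sales")  # keyspace is a placeholder

    # LOCAL_QUORUM: a majority of replicas in the local data center must respond,
    # trading a little latency for stronger consistency than ONE.
    stmt = SimpleStatement(
        "SELECT order_id, status FROM orders WHERE order_id = %s",
        consistency_level=ConsistencyLevel.LOCAL_QUORUM,
    )
    row = session.execute(stmt, ("ORD-12345",)).one()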

Environment: DB2, Teradata, Oracle, MS SQL Server, Informix, Cassandra, Google BigQuery, HDFS, Hadoop (Hive), Yarn, UDB, MySQL, ARz/OS, VSAM, Spark, HBase, Mongo, Linux

Confidential, Alpharetta, GA

Lead Data Migration Engineer Consultant

Responsibilities:
  • Created data models, source-to-destination mappings, data dictionaries and naming standards.
  • Created SEMS interactive Data Mart to contain product license data.
  • Created an Autonomy consolidated Operational Data Store (ODS) that served as the source of orders, products, etc. processed by SAP’s Next Generation Order Management system.
  • Created source-to-destination mappings for migration of Autonomy data into HP’s order, customer, product and order fulfillment systems (SAP, SPARKS, SEMS and AutoPass License Server).
  • Created Informatica applications used for migration of Autonomy’s data into SEMS from the HP Fulfillment system.
  • Led requirements sessions.

Environment: AutoPass License Server, Poetic, Unix, Oracle, SQL Server, SAP, TIBCO, Product Download and Payment Interface system (PDAPI), Support Process And Resolution Knowledge System (SPARKS), Software Support Online (SSO), Software as a Service (SaaS)

Confidential, Atlanta, GA

Data Architect Consultant 

Responsibilities:
  • Profiled legacy Data Warehouses and operational systems.
  • Created data models, source-to-destination mappings, data dictionaries and naming standards.
  • Created Fact, Dimension, Look-Up and Aggregate tables (see the star-schema sketch after this list).
  • Created Informatica applications used for migration of Customer Account data into the new Enterprise Corporate Data Warehouse.
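
As a rough illustration of the Fact, Dimension and Aggregate tables mentioned above (the actual loads were built in Informatica), the sketch below creates a small star-schema slice through the Teradata Python driver (teradatasql); every table, column, and connection value is a hypothetical placeholder and the DDL is simplified:

    # Hypothetical star-schema slice: one dimension, one fact, one aggregate.
    import teradatasql

    ddl_statements = [
        """CREATE TABLE edw.customer_dim (
               customer_key  INTEGER NOT NULL PRIMARY KEY,
               customer_id   VARCHAR(20),
               segment_code  VARCHAR(10)
           )""",
        """CREATE TABLE edw.account_balance_fact (
               customer_key  INTEGER NOT NULL,
               balance_date  DATE    NOT NULL,
               balance_amt   DECIMAL(18,2)
           )""",
        # Aggregate table: monthly balances rolled up from the daily fact.
        """CREATE TABLE edw.account_balance_month_agg AS (
               SELECT customer_key,
                      EXTRACT(YEAR  FROM balance_date) AS balance_year,
                      EXTRACT(MONTH FROM balance_date) AS balance_month,
                      SUM(balance_amt)                 AS month_balance_amt
               FROM   edw.account_balance_fact
               GROUP  BY 1, 2, 3
           ) WITH DATA""",
    ]

    # Connection values are placeholders.
    with teradatasql.connect(host="edw-host", user="etl_user", password="***") as con:
        with con.cursor() as cur:
            for ddl in ddl_statements:
                cur.execute(ddl)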

Environment: Unix, Teradata, Oracle

Confidential, Atlanta, GA

Data Architect 

Responsibilities:
  • Created data models, source-to-destination mappings and data dictionaries.
  • Created the OWS repository for storage of all customer information across Employment Services, Unemployment Insurance and Workforce-In-Action departments.
  • Led requirements sessions.

Environment: DB2, z/OS, VSAM

Confidential, Birmingham, AL

Data Architect Consultant 

Responsibilities:
  • Created data models, source-to-target mappings and data dictionaries.
  • Created Fact, Dimension, Look-Up and Aggregate tables.

Environment: Unix, Teradata, Oracle

Confidential, Alpharetta, GA

Sr. Business Analyst Consultant (Comsys)

Responsibilities:
  • Created use cases to illustrate how ASPEN would process invoice and adjustment files received from Telegence billing (wireless) and reconcile them with Telco billing (landline).
  • Gathered requirements for actuals, accruals and journal entries.
  • Created HP Quality Center test cases.

Environment: Unix, Telco Billing, ASPEN, Telegence Billing, Oracle General Ledger
