Sr. Data Architect/Data Modeler Resume
Atlanta, GA
SUMMARY:
- 10+ years of experience in Data Modeling, Data Architecture, Data Governance, Data Analysis, production support, database management, strategic analysis, requirements gathering, and application and decision support.
- Very good experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect as ETL tools on Oracle 12c/11g/10g, Netezza, SAS, Teradata and SQL Server databases.
- Expertise in EIM suite tools like SAP BODS, Information Steward, and Data governance end to end implementation.
- Administration of BODS/BOBJ Basis on Windows and UNIX platforms, including the management console and CMC.
- Optimized SAP BI extractor designs to extract millions of records for full and delta extracts in conjunction with SAP BODS extraction.
- Developed custom BODS recovery mechanisms to recover failed delta extracts.
- 6+ years of experience as a Tableau Solutions Consultant with BI experience.
- Experienced in developing Conceptual and Logical models and Physical database designs for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems using Erwin 9.6 and Power Designer.
- Experienced in Ralph Kimball and Bill Inmon methodologies (Star Schema, Snowflake Schema) and in object-oriented data modeling using tools like Erwin, ER/Studio and Power Designer, for both forward and reverse engineering.
- Excellent knowledge in data mart design and creation of cubes using dimensional data modeling - identifying Facts and Dimensions, Star Schema and Snowflake Schema.
- Experienced in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
- Experienced in using Teradata tools like FastLoad, MultiLoad, TPump, FastExport, Teradata 15/14 Parallel Transporter (TPT) and BTEQ.
- Experienced in developing Entity-Relationship diagrams and modeling Transactional Databases and Data Warehouse using tools like ERWIN, ER/Studio and Power Designer.
- Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
- Experienced in generating and documenting Metadata while designing OLTP and OLAP systems environment.
- Experienced in SAS with SAS/BASE, SAS/MACROS, SAS/GRAPH, SAS/ACCESS, SAS/CONNECT, SAS/ETL, SAS ODS and Enterprise Guide (EG).
- Experienced in Normalization (1NF, 2NF, 3NF and BCNF) and De-normalization techniques for effective and optimum performance in OLTP and OLAP environments.
- Excellent understanding and working experience of industry standard methodologies like System Development Life Cycle (SDLC), as per Rational Unified Process (RUP), AGILE and Waterfall Methodologies.
- Expertise in implementing ETL (Extract, Transform and Load) using Oracle Warehouse Builder (OWB).
- Experienced with DDL and DML including Joins, Functions, Indexes, Views, Constraints, Primary Keys and Foreign Keys.
- Experienced in designing SQL queries and PL/SQL procedures for moving data from all source systems into the Data Warehousing system.
- Experienced in the integration of various data sources with multiple relational databases like Oracle, MS SQL Server 2014, Netezza, Teradata, SAS and flat files into the staging area, ODS, Data Warehouse and Data Mart.
- Experienced in performing performance improvements and resolving production problems related to WCC/MDM Services.
- Expertise in distributed query performance tuning using the EXPLAIN PLAN, SQL Trace and TKPROF utilities and hints provided by Oracle.
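The EXPLAIN PLAN-driven query tuning named above can be sketched as follows. This is a minimal, illustrative example that uses SQLite's EXPLAIN QUERY PLAN in place of Oracle's EXPLAIN PLAN/TKPROF utilities; the table and column names are hypothetical.

```python
import sqlite3

# Illustrative index-driven tuning sketch (SQLite stands in for Oracle;
# the orders table and its columns are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN is SQLite's analogue of Oracle's EXPLAIN PLAN;
    # the detail text is the last column of each plan row.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"
before = plan(query)   # full table scan before the index exists
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)    # plan now reports a search via the new index
print(before)
print(after)
```

The same workflow applies on Oracle: capture the plan, add or adjust an index (or a hint), and confirm the access path changed before promoting the fix.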
TECHNICAL SKILLS:
Analysis and Modeling Tools: Erwin r9.6/r9.5/r9.1/r8.x, Sybase Power Designer, Oracle Designer, BPwin, ER/Studio, MS Access 2000, Oracle, Star-Schema and Snowflake-Schema Modeling, FACT and Dimension tables, Pivot Tables.
OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9.
Oracle: Oracle 9i, 10g, 11g, 12c R2 database servers with RAC, ASM, Data Guard, Grid Control and Oracle GoldenGate (Oracle Enterprise Manager), Oracle Data Guard, SQL*Net, SQL*Loader, SQL*Plus, AWR, ASH, ADDM, Explain Plan.
ETL Tools: SSIS, Pentaho, Informatica Power Center 9.7/9.6/9.5/9.1 etc.
Programming Languages: Java, Base SAS and SAS/SQL, SQL, T-SQL, HTML, Java Script, CSS, UNIX shells scripting, PL/SQL.
Database Tools: Microsoft SQL Server 2014/2012/2008/2005, Teradata, Oracle 12c/11g/10g/9i, MS Access, PostgreSQL and Netezza.
Web technologies: HTML, DHTML, XML, JavaScript
Reporting Tools: Business Objects, Crystal Reports
Operating Systems: Microsoft Windows 95/98/9x/NT/2000/XP/Vista/7 and UNIX.
Tools & Software: TOAD, MS Office, BTEQ, Teradata 15/14.1/14/13.1/13, SQL Assistant
Other tools: SQL*Plus, SQL*Loader, MS Project, MS Visio, C++, UNIX, PL/SQL etc.
PROFESSIONAL EXPERIENCE:
Sr. Data Architect/Data Modeler
Confidential - Atlanta, GA
Responsibilities:
- Design and develop data warehouse architecture, data modeling/conversion solutions, and ETL mapping solutions within structured data warehouse environments
- Reconcile data and ensure data integrity and consistency across various organizational operating platforms for business impact.
- Define best practices for data loading and extraction and ensure architectural alignment of the designs and development.
- Extensive Tableau experience in an Enterprise environment and Tableau Administrator experience, including technical support, troubleshooting, report design and monitoring of system usage.
- In-depth knowledge on Tableau Desktop, Tableau Reader and Tableau Server.
- Experience in Installation, Configuration and administration of Tableau Server in multi-server and multi-tier environment.
- Experience in building dashboards in Tableau; involved in performance tuning of reports and resolving issues within Tableau Server.
- Created the logical data model from the conceptual model and converted it into the physical database design using Erwin 9.6.
- Established Data Governance around SAP Master Data objects like Customers, Materials, Vendors etc. using SAP BODS/BO Information Steward/DQM
- Installed BO Information Steward, cleansing package, address directories, dashboards and Integration to BODS 4.0/4.1.
- Conducted design walk through sessions with Business Intelligence team to ensure that reporting requirements are met for the business.
- Worked with data from flat files, SQL Server, Oracle, DB2 and Sybase, and loaded the data into flat files, Oracle 12c and SQL Server 2014 using Informatica PowerCenter 9.5.
- Extensively worked on Teradata 15 tools & utilities like FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT) and BTEQ.
- Designed data models leveraging multi-dimensional/star schemas and extended star schemas for sales & distribution and finance & controlling reporting.
- Involved in designing and architecting data warehouses and data lakes on regular (Oracle, SQL Server), high-performance (Netezza and Teradata) and big data (Hadoop - Hive and HBase) databases.
- Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
- Created Entity Relationships diagrams, data flow diagrams and implemented referential integrity using ERWIN.
- Involved with Data Analysis Primarily Identifying Data Sets, Source Data, Source Meta Data, Data Definitions and Data Formats.
- Created data masking mappings to mask the sensitive data between production and test environment.
- Worked on the Change Data Capture process to replicate transactional data from the Sterling IBM DB2 database to the CDW Netezza and DB2 databases.
- Extensively involved in developing ETL processes, loading and validating data from source systems into the data warehouse using Informatica and/or Oracle Data integrator (ODI) and Oracle SQL, PL/SQL procedures.
- Conducted database performance tuning (database objects, SQL, T-SQL and PL/SQL) including Normalization/De-normalization, Indexes, Table Partitioning, Parallel Processing, Caching and Data Compression.
- Involved in transforming and loading datasets with basic modeling variables for each segment type; using PROC IMPORT and SQL queries, imported data from the Oracle database into SAS files.
- Created SAS datasets from Oracle database with random sampling technique and created Oracle tables from SAS datasets by using SAS Macros.
- Created jobs to extract ECC XML messages using various stages (MQ Connector, XML Transformation stage) to load into Netezza tables.
- Troubleshot and enhanced existing reporting for changing business needs; analyzed AML/Fraud inquiries on the MoneyGram system to gather additional facts.
- Involved in designing the ETL process to extract, translate and load data from the OLTP Oracle database system to the Teradata data warehouse.
Environment: Erwin 9.6, DB2, Oracle 12c, SQL, PL/SQL, Tableau, Teradata 15, SQL Server 2014, Informatica, Teradata SQL, SAS, T-SQL, MDM, UNIX, Netezza, Business Objects, BTEQ, MLDM, MDM.
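The fact/dimension star-schema design described in the bullets above can be sketched as follows. This is a minimal illustration with hypothetical table and column names, using SQLite in place of the Teradata/Oracle warehouses named in the role.

```python
import sqlite3

# Minimal star-schema sketch: two dimensions and one fact table,
# then a typical star-join aggregation by a dimension attribute.
# All names are hypothetical; SQLite stands in for the warehouse DB.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, cal_date TEXT, year INTEGER);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
""")
conn.executemany("INSERT INTO dim_customer VALUES (?,?,?)",
                 [(1, "Acme", "East"), (2, "Globex", "West")])
conn.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                 [(20240101, "2024-01-01", 2024)])
conn.executemany("INSERT INTO fact_sales VALUES (?,?,?)",
                 [(1, 20240101, 100.0), (2, 20240101, 250.0), (1, 20240101, 50.0)])

# Star-join: roll the additive fact measure up by dimension attribute.
rows = conn.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer c ON f.customer_key = c.customer_key
    GROUP BY c.region ORDER BY c.region
""").fetchall()
print(rows)  # [('East', 150.0), ('West', 250.0)]
```

The same shape (surrogate keys on dimensions, foreign keys plus additive measures on the fact) is what the Erwin models above forward-engineer into DDL.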
Sr. Data Architect/Data Modeler
Confidential - Malvern PA
Responsibilities:
- Involved in Managing data design standards, master data and metadata management.
- Designed, built and maintained all aspects of the underlying data science and analytics data warehouse/data marts.
- Involved in stewardship of Enterprise Data Lifecycle Management, including data storage, data retention, backup/recovery and data security technologies.
- Extensive Tableau Experience in Enterprise Environment and Tableau Administrator experience including technical support, troubleshooting, report design and monitoring of system usage.
- Involved in upgrading Tableau platforms in clustered environment and performing content upgrades.
- Managed data governance by bringing stakeholders together, driving data quality and leading strategy & architecture for Data Management.
- Worked on data extraction (ETL) from the EDW of SAP BI reports, SAP ECC6.0 transaction data of SAP FI, HR and SCM (Sales and Inventory) using Business Objects SAP RapidMarts
- Worked with SAP Data Source drivers such as SAP R/3, SAP BW
- Performed Data Modeling including Dimensional Data Modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimension tables using Analysis Services and Erwin 9.5.
- Gathered information from different data warehouse systems and loaded into One Sprint Financial Information System Consolidated model using Fast Load, Fast Export, Multi Load, BTEQ and UNIX Shell Scripts.
- Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
- Created Netezza and DB2 objects using Netezza best practices in order to avoid data skews on the Netezza objects.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
- Extracted meaningful data from unstructured data on Hadoop Ecosystem.
- Worked with different sources such as Oracle, Teradata, SQL Server 2012, Excel, flat files, complex flat files and COBOL files.
- Performed data management projects and fulfilled ad-hoc requests according to user specifications by utilizing data management software programs and tools like Perl, Toad, MS Access, Excel and SQL.
- Involved in designing ER diagrams using Oracle 11g Designer to set the logical and physical relationships of the database.
- Involved in Teradata SQL development, unit testing and performance tuning, ensuring testing issues were resolved using defect reports.
- Designed high level ETL architecture for overall data transfer from the OLTP to OLAP with the help of SSIS.
- Created schemas, data types, primary & foreign keys, default values, table spaces, hashing and partitioning; defined volumetrics and Normalization/De-normalization of the model (where necessary), and helped define access requirements.
- Provided technical guidance for re-engineering functions of Oracle warehouse operations into Netezza.
- Extensively worked with Teradata 14.1 utilities like BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems including flat files.
- Involved in the integration of multiple source systems into MDM and participated actively in data profiling and mapping exercises to define the data architecture.
- Performed advanced querying using SAS Enterprise Guide: calculating computed columns, using filters, manipulating and preparing data for reporting, graphing, summarization and statistical analysis, and finally generating SAS datasets.
- Worked extensively with Extraction, Transformation and Loading of data from multiple sources (Oracle, Netezza, Teradata, fixed-width and delimited flat files) to Oracle using Informatica.
Environment: Erwin 9.5, UNIX, Teradata 14.1, MDM/Activators, SAS/BASE, SAS/MACRO, SAS/STAT, SAS/GRAPH, PL/SQL, ETL Informatica, Tableau, T-SQL, SSRS, SSIS, DB2, SQL Server 2012, Data Stores, Netezza, Aginity, Oracle 11g and Teradata SQL Assistant etc.
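The column-level data profiling and data quality work described above (null counts, distinct counts as inputs to quality rules) can be sketched as follows. This is an illustrative example with hypothetical table and column names, using SQLite rather than the Netezza/DB2 platforms named in the role.

```python
import sqlite3

# Sketch of column-level data profiling: for a given column, report
# total rows, null count and distinct count. Names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_customer (id INTEGER, email TEXT, state TEXT)")
conn.executemany("INSERT INTO staging_customer VALUES (?,?,?)", [
    (1, "a@x.com", "PA"),
    (2, None,      "PA"),   # null email -> quality-rule violation
    (3, "c@x.com", None),
    (4, "a@x.com", "GA"),   # repeated email -> possible duplicate record
])

def profile(table, column):
    # COUNT(col) skips NULLs, so total - COUNT(col) gives the null count.
    total, nulls, distinct = conn.execute(
        f"SELECT COUNT(*), COUNT(*) - COUNT({column}), COUNT(DISTINCT {column}) "
        f"FROM {table}").fetchone()
    return {"total": total, "nulls": nulls, "distinct": distinct}

email_profile = profile("staging_customer", "email")
print(email_profile)  # {'total': 4, 'nulls': 1, 'distinct': 2}
```

Profiles like this are what feed the data quality rules and mapping exercises mentioned in the bullets above: a non-zero null count or an unexpectedly low distinct count flags a column for cleansing before load.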
Sr. Data Modeler/Data Analyst
Confidential - Southfield, MI
Responsibilities:
- Worked extensively with dimensional modeling, data migration, data cleansing and ETL processes for data warehouses.
- Analysis included data reports, generalized reports, SQL queries from Teradata 14 to the Netezza and DB2 platforms, and variable distribution reports for clients as well as vendors.
- Developed Teradata 13.1 BTEQ scripts to populate target tables; responsible for data profiling, data mapping, data loading and data validation.
- Involved in OLAP and OLTP systems and dimensional modeling using Star schema and Snowflake schema.
- Worked on Normalization and De-normalization techniques for both OLTP and OLAP systems.
- Created tables, views, sequences, triggers, table spaces and constraints, and generated DDL scripts for physical implementation.
- Loaded data into the OLTP system using FitNesse DbFit, Oracle and SQL Developer.
- Used Teradata SQL Assistant, Teradata Administrator, PMON and data load/export utilities like BTEQ, FastLoad, MultiLoad, FastExport and TPump on UNIX/Windows environments, and ran the batch process for Teradata.
- Tested the ETL process both before and after the data validation process; tested the messages published by the ETL tool and the data loaded into various databases.
- Generated DDL scripts using Forward Engineering technique to create objects and deploy them into the databases.
- Performed data analysis, data migration and data profiling using complex SQL on various source systems including Oracle and Teradata.
- Worked on SQL Server concepts SSIS (SQL Server Integration Services), SSAS (Analysis Services) and SSRS (Reporting Services).
- Developed SQL scripts for loading data from the staging area to target tables.
- Created tables, indexes and constraints, wrote T-SQL statements for data retrieval, and was involved in performance tuning of T-SQL queries and stored procedures.
Environment: Erwin 9.1, Teradata 14, Oracle 10g, PL/SQL, UNIX, MDM, ETL, Tableau, BTEQ, SQL Server, Netezza, DB2, T-SQL, SQL, Informatica, SSRS, SSIS, XML, SAS Data etc.
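The post-load data validation described above (comparing the ETL target back to its source) is often done by reconciling row counts and a measure checksum. The following is a minimal sketch under that assumption, with hypothetical table names and SQLite standing in for the source and target databases.

```python
import sqlite3

# Source-to-target reconciliation sketch: compare row count and an
# amount checksum after an ETL load. All names are hypothetical.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE txn (txn_id INTEGER, amount REAL)")

src.executemany("INSERT INTO txn VALUES (?,?)", [(1, 10.0), (2, 20.0), (3, 30.0)])
tgt.executemany("INSERT INTO txn VALUES (?,?)", [(1, 10.0), (2, 20.0)])  # one row dropped in load

def summary(db):
    # COALESCE keeps the checksum well-defined even for an empty table.
    return db.execute("SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM txn").fetchone()

src_count, src_sum = summary(src)
tgt_count, tgt_sum = summary(tgt)
mismatch = (src_count != tgt_count) or (src_sum != tgt_sum)
print(src_count, tgt_count, mismatch)  # 3 2 True
```

A mismatch flags the load for investigation; in practice the same comparison is pushed down to per-partition or per-key level to locate the missing rows.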
Sr. Data Modeler/Data Analyst
Confidential - Plano, TX
Responsibilities:
- Worked with data warehousing methodologies and dimensional data modeling techniques such as Star/Snowflake schema using Erwin 9.1.
- Created jobs in DataStage to import data from heterogeneous data sources like Sybase, Oracle, text files and SQL Server 2008.
- Extensively used Aginity Netezza Workbench to perform various DDL, DML etc. operations on the Netezza database.
- Designed the Data Warehouse and MDM hub conceptual, logical and physical data models.
- Performed daily monitoring of Oracle instances using Oracle Enterprise Manager, ADDM and TOAD; monitored users, table spaces, memory structures, rollback segments, logs and alerts.
- Used CA ERwin Data Modeler (ERwin) for data modeling (data requirements analysis, database design etc.) of custom-developed information systems, including databases of transactional systems and data marts.
- Involved in customized reports using the SAS/MACRO facility and the PROC REPORT and PROC TABULATE procedures.
- Used Normalization methods up to 3NF and De-normalization techniques for effective performance in OLTP systems.
- Involved in database testing, writing complex SQL queries to verify transactions and business logic, such as identifying duplicate rows, using SQL Developer and PL/SQL Developer.
- Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
- Worked on data warehouse concepts like data warehouse architecture, Star schema, Snowflake schema, Data Marts, and Dimension and Fact tables.
- Developed SQL queries to fetch complex data from different tables in remote databases using joins, database links and bulk collects.
- Involved in database migrations from legacy systems and SQL Server to Oracle and Netezza.
- Used SSIS to create ETL packages to validate, extract, transform and load data from source servers to the staging database and then to the Netezza and DB2 databases.
Environment: Erwin, Data Modeling, Star schema, Teradata 13.1, SQL, PL/SQL, BTEQ, DB2, Oracle, MDM, Netezza, ETL, HTML, XML, RTF and PDF, UNIX, SQL Server 2008, (UDDI, WSDL, SOAP, RPC, XML and ESB), etc.
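The validate-extract-transform-load flow of the SSIS packages described above can be sketched in miniature as follows. This is an illustrative example only: the row data, cleansing rules and target table are hypothetical, and plain Python plus SQLite stand in for the SSIS data flow and the Netezza/DB2 targets.

```python
import sqlite3

# Minimal ETL sketch: extract raw rows, validate/cleanse them, and load
# the survivors into a target table. All names are hypothetical.
raw_rows = [
    {"id": "1", "name": " alice ", "amount": "100.5"},
    {"id": "2", "name": "BOB",     "amount": "bad"},   # fails numeric validation
    {"id": "3", "name": "Carol",   "amount": "42"},
]

def transform(row):
    # Cleanse: trim/title-case the name, cast id and amount;
    # reject any row whose amount is not numeric.
    try:
        return (int(row["id"]), row["name"].strip().title(), float(row["amount"]))
    except ValueError:
        return None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, name TEXT, amount REAL)")
clean = [t for t in (transform(r) for r in raw_rows) if t is not None]
conn.executemany("INSERT INTO target VALUES (?,?,?)", clean)
loaded = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(loaded)  # 2
```

In the real packages the rejected rows would be routed to an error output for review rather than silently dropped, but the validate-transform-load split is the same.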
Data Analyst
Confidential - Hartford, CT
Responsibilities:
- Served as a Talent Acquisition Data Analyst with a focus on responding to ad-hoc data analysis requests.
- Excellent knowledge in extraction, cleansing and modification of data from/to various data sources like flat files, sequential files, comma-delimited (CSV) files, XML, and databases like Oracle, ODBC, DB2, Teradata 13 etc.
- Worked with data from a wide variety of sources like flat files, XML files, relational databases (Oracle 9i, SQL Server, PostgreSQL and Netezza) and legacy Mainframe and SAP source systems using Informatica PowerCenter.
- Involved in building Data Marts and multi-dimensional models like Star Schema and Snowflake Schema.
- Involved in SAS/BASE, SAS/SQL, SAS/Macro, SAS/Connect, SAS/Access, SAS/ODS, SAS/Stat and SAS/Graph.
- Involved in loading data into Netezza from legacy systems and flat files using complex UNIX scripts.
- Experience in implementing MDM (Master Data Management) and DQS (Data Quality Services) using SQL Server 2005.
- Worked on data migration between databases using the Oracle export/import utility and Oracle exp/imp Data Pump.
- Analyzed developers' code and assisted the development team in implementing appropriate code changes in both Oracle databases.
- Worked with SQL, SQL PLUS, Oracle PL/SQL Stored Procedures, Triggers, SQL queries and loading data into Data Warehouse/Data Marts.
- Involved in Normalization and De-normalization of existing tables for faster query retrieval.
Environment: Netezza, DB2, T-SQL, DTS, SSIS, SSRS, SSAS, ETL, Hadoop, MDM, 3NF and De-normalization, SQL Server 2005, Teradata 13, Oracle 9i (Star Schema and Snowflake Schema), SAS/BASE, SAS/SQL, SAS/Macro, SAS/Connect, SAS/Access, SAS/ODS, SAS/Stat and SAS/Graph etc.
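The duplicate-row identification that supports the MDM and data cleansing work described above is typically a GROUP BY ... HAVING query. The following is a minimal sketch with hypothetical table and column names, using SQLite in place of the databases named in the role.

```python
import sqlite3

# Duplicate detection ahead of MDM matching: group candidate records on a
# match key (here, email) and keep groups with more than one member.
# All names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (cust_id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customer VALUES (?,?)", [
    (1, "a@x.com"), (2, "b@x.com"), (3, "a@x.com"), (4, "a@x.com"),
])

dupes = conn.execute("""
    SELECT email, COUNT(*) AS n
    FROM customer
    GROUP BY email
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [('a@x.com', 3)]
```

In an MDM flow, each flagged group then goes through survivorship rules to pick or merge a golden record; the query above only identifies the candidates.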