
Sr. Data Architect/Data Modeler Resume


Boston, MA

SUMMARY

  • Over 11 years of experience as a Data Architect/Modeler and Data Analyst with high proficiency in requirements gathering and data modeling, including design and support of various applications in OLTP, data warehousing, OLAP, and ETL environments.
  • Experienced in designing data warehouse architectures and models using tools such as Erwin r9.6/r9.5, Sybase PowerDesigner, and ER/Studio.
  • Experience in SQL and good knowledge of PL/SQL programming, including development of stored procedures and triggers; working knowledge of DataStage, DB2, UNIX, Cognos, MDM, Hadoop, and Pig.
  • Experience in designing star schemas (identification of facts, measures, and dimensions) and snowflake schemas for data warehouse and ODS architectures using tools such as Erwin Data Modeler, PowerDesigner, ER/Studio, and Microsoft Visio.
  • Experience in importing and exporting data between relational databases such as MySQL 6.x, Netezza, and Oracle and HDFS/Hive using Sqoop.
  • Experience in writing and optimizing SQL queries in Oracle, SQL Server 2014, DB2, Netezza, and Teradata.
  • Experience in metadata design and real-time BI architecture, including data governance for greater ROI.
  • Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad, and FastExport.
  • Good understanding of AWS, big data concepts, and the Hadoop ecosystem.
  • Knowledge of BI and data warehousing principles, including data modeling, data quality, and extract/transform/load processes as they apply to data preparation for visualization.
  • Experienced data modeler with conceptual, logical, and physical data modeling skills, data profiling skills, and data quality maintenance on Teradata 15/14; experienced with JAD sessions for requirements gathering and with creating data mapping documents, functional specifications, and queries.
  • Excellent experience in SQL*Loader, SQL data modeling, reporting, and SQL database development; loaded data from legacy systems into Oracle databases using control files and used the Oracle external tables feature to read data from flat files into Oracle staging tables (see the sketch after this summary).
  • Good experience in normalization of entities for OLTP and de-normalization for the enterprise data warehouse.
  • Experienced with databases including Oracle, DB2, Teradata, Netezza, and SQL Server, as well as XML, big data, and NoSQL stores.
  • Good knowledge of database creation and maintenance of physical data models for Oracle, Teradata, Netezza, DB2, and SQL Server databases.
  • Familiar with Kimball DW/BI modeling principles and knowledgeable in data warehouse modeling for different kinds of businesses.
  • Experience with the Teradata RDBMS using FastLoad, FastExport, MultiLoad, TPump, Teradata SQL Assistant, and the BTEQ utility.
  • Hands-on experience with the Hadoop ecosystem (MapReduce, HBase, Pig, Hive, Impala, Sqoop).
  • Experienced in data analysis, data migration, data cleansing, transformation, integration, data import, and data export through the use of multiple ETL tools such as Ab Initio and Informatica PowerCenter.
  • Experienced in data modeling, including data validation/scrubbing and operational assumptions.
  • Very good knowledge of data analysis, data validation, data cleansing, data verification, and identifying data mismatches.
  • Extensive experience using Excel pivot tables to run and analyze result data sets, and in UNIX scripting.
  • Extensive experience with ETL and reporting tools such as SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
  • Experience in normalization and de-normalization processes and in logical and physical data modeling techniques.
  • Experience in database performance tuning and data access optimization, and in writing complex SQL queries and PL/SQL blocks such as stored procedures, functions, triggers, cursors, and ETL packages.
  • Experience with SQL Server and T-SQL in constructing temporary tables, table variables, triggers, user-defined functions, views, and stored procedures.
  • Excellent problem-solving, mentoring, communication, and analytical skills, with the ability to work in a team or individually.
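
The external-table loading pattern mentioned in the summary can be sketched as follows. This is a minimal illustration, assuming a hypothetical directory object, flat file, and staging table (legacy_dir, customers.dat, stg_customers); none of these names come from an actual project.

```sql
-- Minimal sketch of loading legacy flat-file data via an Oracle
-- external table; all object names here are hypothetical.

-- Directory object pointing at the legacy flat files.
CREATE OR REPLACE DIRECTORY legacy_dir AS '/data/legacy/incoming';

-- External table: Oracle reads the file in place using
-- SQL*Loader-style access parameters, with no separate load step.
CREATE TABLE stg_customers_ext (
    customer_id   NUMBER,
    customer_name VARCHAR2(100),
    signup_date   VARCHAR2(10)
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY legacy_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
    )
    LOCATION ('customers.dat')
)
REJECT LIMIT UNLIMITED;

-- Move the rows into a regular staging table (assumed to exist)
-- with plain SQL, converting types along the way.
INSERT INTO stg_customers (customer_id, customer_name, signup_date)
SELECT customer_id,
       customer_name,
       TO_DATE(signup_date, 'YYYY-MM-DD')
FROM   stg_customers_ext;
COMMIT;
```

The appeal of this pattern over a pure SQL*Loader control-file load is that the flat file becomes queryable, so type conversion and filtering happen in ordinary SQL.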

TECHNICAL SKILLS

Data Modeling Tools: Erwin r9.6/r9.5, ER/Studio, Sybase PowerDesigner, and Oracle Designer.

OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9.

ETL Tools: SSIS, Pentaho, Informatica 9.6.

Programming Languages: Java, Base SAS and SAS/SQL, SQL, T-SQL, HTML, JavaScript, CSS, UNIX shell scripting, PL/SQL.

Big Data: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Flume.

Database Tools: Oracle, Teradata, Netezza, Microsoft SQL Server 2014/2012/2008/2005, MS Access, and PostgreSQL.

Web technologies: HTML, DHTML, XML, JavaScript

Reporting Tools: Business Objects, Crystal Reports

Operating Systems: Microsoft Windows 8/7, UNIX, Linux (Red Hat)

Tools & Software: TOAD, SQL*Plus, SQL*Loader, MS Office, BTEQ, Teradata SQL Assistant

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Waterfall Model.

PROFESSIONAL EXPERIENCE

Confidential, Boston, MA

Sr. Data Architect/Data Modeler

Responsibilities:

  • Heavily involved in the Data Architect role, reviewing business requirements and composing source-to-target data mapping documents.
  • Researched, evaluated, architected, and deployed new tools, frameworks, and patterns to build sustainable big data platforms for clients.
  • Created logical/physical data models in Erwin for new requirements and existing databases, maintained database standards, provided architectural guidance for various data design/integration/migration efforts, and analyzed data in different systems.
  • Developed and configured the Informatica MDM Hub to support the Master Data Management (MDM), Business Intelligence (BI), and data warehousing platforms and meet business needs.
  • Used load utilities (FastLoad and MultiLoad) with the mainframe interface to load data into Teradata.
  • Performed data reconciliation between the source and EDW Teradata databases.
  • Demonstrated experience in the design and implementation of an enterprise data model, metadata solution, and data life cycle management in both RDBMS and big data environments.
  • Applied architectural and technology concepts to address scalability, security, reliability, maintainability, and sharing of enterprise data.
  • Performed data profiling and imported/updated table definitions using SQL and PowerCenter Designer.
  • Changed session properties to override settings including source/target table names, schema names, source queries, connection strings, update strategy, and log-detail tracing level.
  • Analyzed business requirements against functional specifications to design the ETL methodology in technical specifications.
  • Involved in logical modeling using dimensional modeling techniques such as star schema and snowflake schema.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop (see the HiveQL sketch after this list).
  • Worked on database design, relational integrity constraints, OLAP, OLTP, cubes, and normalization (3NF) and de-normalization of the database.
  • Worked with various Teradata 15 tools and utilities such as Teradata Viewpoint, MultiLoad, ARC, Teradata Administrator, and BTEQ.
  • Supported the legacy ETL process developed on Teradata.
  • Developed reports using Business Objects per client requirements.
  • Involved in several facets of MDM implementations, including data profiling, metadata acquisition, and data migration.
  • Tuned SQL statements and analyzed query performance issues in Teradata.
  • Extensively used Aginity Workbench for Netezza to perform various DML and DDL operations on the Netezza database.
  • Tested complex ETL mappings and sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
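
As a rough illustration of the Sqoop-to-Hive flow in the bullets above: once Sqoop has landed the MySQL extracts as delimited files in HDFS, a HiveQL script of this shape can expose and transform them. The HDFS path and table names (orders_raw, orders_clean) are hypothetical placeholders, not artifacts of the actual project.

```sql
-- Hedged HiveQL sketch; the path and table names are hypothetical.

-- External table over the delimited files Sqoop wrote to HDFS.
CREATE EXTERNAL TABLE IF NOT EXISTS orders_raw (
    order_id    BIGINT,
    customer_id BIGINT,
    amount      DOUBLE,
    order_ts    STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/etl/orders_raw';

-- Cleansed, partitioned target table in a columnar format.
CREATE TABLE IF NOT EXISTS orders_clean (
    order_id    BIGINT,
    customer_id BIGINT,
    amount      DOUBLE
)
PARTITIONED BY (order_date STRING)
STORED AS ORC;

-- Transform and load, deriving the partition key from the timestamp.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

INSERT OVERWRITE TABLE orders_clean PARTITION (order_date)
SELECT order_id,
       customer_id,
       amount,
       to_date(order_ts) AS order_date
FROM   orders_raw
WHERE  order_id IS NOT NULL;
```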

Environment: Erwin r9.6, MDM, Hadoop, HBase, HDFS, Sqoop, UNIX, Teradata SQL Assistant, Netezza, Aginity, PL/SQL, ETL, OLAP, OLTP, Oracle 12c, Teradata 15.

Confidential, Chicago, IL

Sr. Data Architect/Data Modeler

Responsibilities:

  • Accountable for defining database physical structure, functional capabilities, security, backup, and recovery specifications, and for the installation of database systems.
  • Maintained database performance through the resolution of application development and production issues.
  • Presented data scenarios via Erwin 9.5 logical models and Excel mockups to better visualize the data.
  • Loaded data directly from Oracle to Netezza without any intermediate files.
  • Designed the layout for extraction, Data Pump, and replication parameters for data replication/data distribution.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data using Hadoop/big data techniques.
  • Worked on NoSQL databases including HBase, MongoDB, and Cassandra; implemented a multi-data-center, multi-rack Cassandra cluster.
  • Installed and configured other open-source software such as Pig, Hive, HBase, Flume, and Sqoop.
  • Designed and produced client reports using Excel, Access, Tableau, and SAS.
  • Worked closely with the enterprise architect, product manager, and technical team to develop the data architecture for a new banking product.
  • Developed master data management strategies for storing reference data.
  • Represented conceptual, logical, and physical data models using relational and NoSQL databases.
  • Created logical and physical data models using Cassandra's data model (see the CQL sketch after this list).
  • Installed and configured Cassandra and created the Cassandra database.
  • Fine-tuned Cassandra database performance by configuring commit logs, memtables, and cache sizes.
  • Involved in normalization/de-normalization techniques for optimum performance in relational and dimensional database environments.
  • Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle 12c, Netezza, flat files, and SQL Server.
  • Worked with data warehousing concepts such as the Ralph Kimball and Bill Inmon methodologies, OLAP, OLTP, star schema, snowflake schema, fact tables, and dimension tables.
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP, ODS, and OLAP.
  • Performed data analysis tasks on warehouses drawing from several sources such as Oracle 11g, Teradata, and XML, and generated various reports and documents.
  • Worked with the ETL developer on the design and development of extract, transform, and load processes for data integration projects to build data marts.
  • Created and implemented the MDM data model for the Consumer/Provider HealthCare MDM product from Variant.
  • Performed data analysis and statistical analysis and generated reports, listings, and graphs using SAS tools: SAS/Base, SAS/Macros, SAS/Graph, SAS/SQL, SAS/Connect, and SAS/Access.
  • Worked in the capacity of ETL developer (Oracle Data Integrator (ODI)/PL/SQL) to migrate data from different sources into the target Oracle data warehouse.
  • Used external loaders such as MultiLoad, TPump, and FastLoad to load data into the Teradata 14.1 database.
  • Strong knowledge of data modeling concepts: star schema/snowflake modeling, fact and dimension tables, and logical and physical data modeling.
  • Worked with the Informatica utilities Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
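
To make the Cassandra modeling bullets above concrete, here is a minimal CQL sketch of a query-driven table. The keyspace, table, and replication settings (banking, account_transactions, two data centers at replication factor 3) are hypothetical illustrations, not the project's actual layout.

```sql
-- Hedged CQL sketch; all names and replication factors are
-- hypothetical placeholders.

-- Multi-data-center keyspace, matching a multi-DC cluster.
CREATE KEYSPACE IF NOT EXISTS banking
WITH replication = {
    'class': 'NetworkTopologyStrategy',
    'dc1': 3,
    'dc2': 3
};

-- Query-driven table: "latest transactions for an account".
-- account_id is the partition key (even data distribution);
-- txn_time is a clustering column stored newest-first so the
-- most recent rows are read sequentially from one partition.
CREATE TABLE IF NOT EXISTS banking.account_transactions (
    account_id  uuid,
    txn_time    timestamp,
    txn_id      uuid,
    amount      decimal,
    description text,
    PRIMARY KEY ((account_id), txn_time, txn_id)
) WITH CLUSTERING ORDER BY (txn_time DESC, txn_id ASC);
```

Unlike the relational models elsewhere in this resume, the table is shaped around a single read path rather than around normalized entities.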

Environment: Erwin 9.5, PL/SQL, ODS, OLAP, OLTP, SAS, MDM, MongoDB, Tableau, Netezza, Cassandra, flat files, Hadoop, HDFS, Pig, Oracle 11g, UNIX, Teradata

Confidential, Livonia, MI

Sr. Data Architect/Data Modeler

Responsibilities:

  • Worked with data compliance and data governance teams to maintain data models, metadata, and data dictionaries, and to define source fields and their definitions.
  • Performed source system analysis, database design, and data modeling for the warehouse layer using MLDM concepts and for the package layer using dimensional modeling.
  • Worked with Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, and TPump) on both Windows and mainframe platforms.
  • Monitored and measured data architecture processes and standards to ensure value was being driven and delivered as expected.
  • Developed various QlikView data models by extracting and using data from various source files, Excel, big data, and flat files.
  • Developed strategies for data acquisition, archive recovery, and implementation of databases; worked in a data warehouse environment covering data design, database architecture, and metadata and repository creation.
  • Involved in reviewing business requirements and analyzing data sources from Excel/Oracle/SQL Server 2012 for the design, development, testing, and production rollout of reporting and analysis projects within Tableau Desktop against a Netezza database.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Involved in translating business needs into long-term architecture solutions and reviewing object models, data models, and metadata.
  • Documented logical, physical, relational, and dimensional data models; designed the data marts in dimensional data modeling using star and snowflake schemas.
  • Transformed the logical data model into an Erwin physical data model, ensuring primary key and foreign key relationships in the PDM, consistency of data attribute definitions, and primary index considerations.
  • Extensively developed Oracle 10g stored packages, procedures, functions, and database triggers using PL/SQL for the ETL process, data handling, logging, and archiving, and to perform Oracle back-end validations for batch processes (see the PL/SQL sketch after this list).
  • Used Netezza SQL, stored procedures, and the nzload utility as part of the DWH appliance framework.
  • Worked with the UNIX team to install the TIDAL job scheduler in the QA and production Netezza environments.
  • Worked with DBAs to create a best-fit physical data model from the logical data model.
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Designed and documented use cases, activity diagrams, sequence diagrams, and object-oriented designs (OOD) using UML and Visio.
  • Worked in development and maintenance using Oracle SQL, PL/SQL, SQL*Loader, and Informatica PowerCenter 9.1.
  • Used Erwin 9.1 for effective model management, sharing, dividing, and reusing model information and designs for productivity improvement.
  • Coordinated with data architects and data modelers to create new schemas and views in Netezza to improve report execution time, and worked on creating optimized data mart reports.
  • Extensively used SQL*Loader to load data from legacy systems into Oracle databases using control files, and used the Oracle external tables feature to read data from flat files into Oracle staging tables.
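
A minimal sketch of the PL/SQL batch-ETL pattern described above: a stored procedure that validates staging data, loads a target table, and logs the outcome. All object names (stg_sales, sales_fact, etl_log) are hypothetical.

```sql
-- Hedged PL/SQL sketch of an ETL load with validation and logging;
-- object names are hypothetical placeholders.
CREATE OR REPLACE PROCEDURE load_sales_fact AS
    v_rows PLS_INTEGER;
BEGIN
    -- Back-end validation: fail the batch if dimension keys are missing.
    SELECT COUNT(*) INTO v_rows
    FROM   stg_sales
    WHERE  product_id IS NULL OR store_id IS NULL;
    IF v_rows > 0 THEN
        RAISE_APPLICATION_ERROR(-20001,
            'Staging rows with NULL dimension keys: ' || v_rows);
    END IF;

    -- Load from staging into the fact table.
    INSERT INTO sales_fact (product_id, store_id, sale_date, amount)
    SELECT product_id, store_id, sale_date, amount
    FROM   stg_sales;
    v_rows := SQL%ROWCOUNT;

    -- Record the outcome in a logging table.
    INSERT INTO etl_log (job_name, run_time, rows_loaded, status)
    VALUES ('load_sales_fact', SYSDATE, v_rows, 'SUCCESS');
    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        INSERT INTO etl_log (job_name, run_time, rows_loaded, status)
        VALUES ('load_sales_fact', SYSDATE, 0,
                SUBSTR('FAILED: ' || SQLERRM, 1, 200));
        COMMIT;
        RAISE;
END load_sales_fact;
/
```

A production version would typically write the log record through an autonomous transaction so error logging stays independent of the failed load's transaction.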

Environment: Erwin 9.1, Teradata, Oracle 10g, PL/SQL, Hadoop, HDFS, MS Visio, flat files, OLTP, OLAP, Netezza

Confidential, Chicago, IL

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Designed ER diagrams (physical and logical, using Erwin), mapped the data into database objects, and produced logical/physical data models.
  • Performed data analysis, data migration, and data profiling using complex SQL on various source systems, including SQL Server and Teradata 13.1.
  • Involved in normalization/de-normalization, normal forms, and database design methodology; expertise in using data modeling tools such as MS Visio and Erwin for logical and physical database design.
  • Involved in data analysis and modeling for the OLAP and OLTP environments.
  • Planned, scheduled, and controlled projects based on plans and requirements outlined by the business.
  • Conducted JAD sessions with SMEs, stakeholders, and other management teams to finalize the user requirements documentation.
  • Reviewed business requirement documents and technical specifications.
  • Involved in writing test plans and test cases using Mercury Quality Center.
  • Coordinated with track leads and the project manager to set up the pre-validation and validation environments to execute the test scripts.
  • Participated in writing data mapping documents and performed gap analysis on the systems.
  • Involved in dimensional modeling (star schema) of the data warehouse; used Erwin to design the business process, dimensions, and measured facts.
  • Used Base SAS, SAS/Macro, SAS/SQL, and Excel extensively to develop code and generate various analytical reports.
  • Used the Model Manager option in Erwin to synchronize the data models in the Model Mart approach.
  • Worked with the ETL team to document the transformation rules for data migration from OLTP to the warehouse environment for reporting purposes.
  • Involved in the integration of various relational and non-relational sources such as Teradata, SFDC, Netezza, SQL Server, COBOL, XML, and flat files.
  • Worked on importing and cleansing high-volume data from various sources such as Teradata, flat files, and SQL Server 2008.
  • Developed the dimensional model for data warehouse/OLAP applications by identifying the required facts and dimensions (see the star-schema sketch after this list).
  • Developed code per the client's requirements using SQL, PL/SQL, and data warehousing concepts.
  • Wrote T-SQL statements for data retrieval and was involved in performance tuning of T-SQL queries and stored procedures.
  • Used Performance Monitor and SQL Profiler to optimize queries and enhance the performance of database servers.
  • Created SSIS packages to export data from text files to the SQL Server database.
  • Created new replications and DTS packages to maintain high availability of the data between servers.
  • Gathered business requirements by organizing and managing meetings with business stakeholders, application architects, technical architects, and IT analysts on a scheduled basis.
  • Analyzed the business requirements by dividing them into subject areas and understanding the data flow within the organization.
  • Created a data mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Created and maintained metadata, including table and column definitions.
  • Involved in the project's purpose of migrating the current Optum Rx data warehouse from its existing database environment to a Netezza appliance.
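
The fact-and-dimension identification described above can be sketched in SQL Server-era T-SQL DDL. The grain and all names (dim_date, dim_product, fact_sales) are hypothetical illustrations of the star-schema pattern, not the project's actual model.

```sql
-- Hedged star-schema sketch (T-SQL); names and grain are hypothetical.

-- Dimension: one row per calendar day, keyed by a smart integer.
CREATE TABLE dim_date (
    date_key   INT          NOT NULL PRIMARY KEY,  -- e.g. 20081231
    full_date  DATE         NOT NULL,
    month_name VARCHAR(10)  NOT NULL,
    year_num   SMALLINT     NOT NULL
);

-- Dimension: one row per product, with a surrogate key.
CREATE TABLE dim_product (
    product_key  INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    product_code VARCHAR(20)       NOT NULL,  -- natural/business key
    product_name VARCHAR(100)      NOT NULL
);

-- Fact: grain = one product sold on one day; measures are additive.
CREATE TABLE fact_sales (
    date_key     INT           NOT NULL REFERENCES dim_date (date_key),
    product_key  INT           NOT NULL REFERENCES dim_product (product_key),
    units_sold   INT           NOT NULL,
    sales_amount DECIMAL(12,2) NOT NULL,
    CONSTRAINT pk_fact_sales PRIMARY KEY (date_key, product_key)
);
```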

Environment: Erwin 8, SQL Server 2008, OLAP, OLTP, flat files, metadata, Teradata 13, PL/SQL, BTEQ, DB2, Oracle, Netezza, UNIX, TOAD.

Confidential, Charlotte, NC

Data Analyst/Data Modeler

Responsibilities:

  • Developed SQL and PL/SQL scripts for migration of data between databases.
  • Involved in requirements analysis and in ETL design and development for extracting data from source systems such as DB2, Oracle, and flat files, and loading into Netezza.
  • Worked on the MySQL database, writing simple queries and stored procedures for normalization/de-normalization.
  • Developed logical and physical data models capturing current-state/future-state data elements and data flows using Erwin and a star schema.
  • Generated customized reports using the SAS/Macro facility, PROC REPORT, PROC TABULATE, and PROC SQL.
  • Created data flow diagrams and process flow diagrams for various load components, such as the FTP load, the SQL*Loader load, the ETL process, and other processes requiring transformation.
  • Prepared the strategy to implement ETL code in production and to load historical as well as incremental data in production.
  • Analyzed users' statements of requirements to develop new reports.
  • Analyzed user requirements, attended change request meetings to document changes, and implemented procedures to test changes.
  • Gathered statistics on data objects using ANALYZE and DBMS_STATS (see the sketch after this list).
  • Involved in mapping spreadsheets that provide the data warehouse development (ETL) team with source-to-target data mappings, inclusive of logical names, physical names, data types, domain definitions, and corporate metadata definitions.
  • Converted logical models into physical database models to build and generate DDL scripts.
  • Created the mapping document and mappings in the MDM Hub ORS using various cleanse lists and cleanse functions.
  • Strong knowledge of databases such as Oracle, DB2, and Netezza.
  • Coordinated with the DBA in implementing database changes and updating data models with changes implemented in development, QA, and production.
  • Knowledge of merging several flat files into one XML file.
  • Extensively worked on creating, altering, and deleting tables in different development environments as well as in production.
  • Extensively worked with logical models to turn them into complete physical models ready to implement in the database.
  • Extensively worked with developers on analyzing the impact of changes on the respective applications, based on which the design approach was chosen, and implemented the same changes in the database with the help of the DBA.
  • Prepared in-depth data analysis reports weekly, biweekly, and monthly using MS Excel, SQL, and UNIX.
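
For the statistics-gathering bullet above, here is a minimal sketch of the DBMS_STATS call (the older ANALYZE command fills a similar role). The schema and table names are hypothetical.

```sql
-- Hedged sketch of refreshing optimizer statistics; ETL_OWNER and
-- SALES_FACT are hypothetical placeholders.
BEGIN
    DBMS_STATS.GATHER_TABLE_STATS(
        ownname          => 'ETL_OWNER',
        tabname          => 'SALES_FACT',
        estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
        cascade          => TRUE  -- refresh index statistics too
    );
END;
/
```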

Environment: Erwin 7.0, Informatica ETL, PL/SQL, Netezza, DB2, Oracle, flat files, SAS, metadata, MySQL, UNIX, MS Excel.
