Sr. Data Architect/Data Modeler Resume

Livonia, MI

SUMMARY

  • 9+ years of industry experience as a Data Analyst with a solid understanding of data modeling and evaluating data sources, and a strong understanding of Data Warehouse/Data Mart design, OBIEE, ETL, BI, OLAP, and client/server applications.
  • Expertise in writing and optimizing SQL queries in Oracle, SQL Server 2008, and Teradata V2R6/R12/R13.
  • Expertise in designing data architecture, data analysis, and Oracle PL/SQL programming.
  • Excellent Software Development Life Cycle (SDLC) experience with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling.
  • Excellent knowledge in data analysis, data validation, data profiling, data cleansing, data verification, and identifying data mismatches.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata V2R6/R12/R13.
  • Proficient in configuring, implementing, and working with Oracle BI Apps and the respective ETL/ELT tools such as Informatica and ODI.
  • Strong hands-on experience with Informatica PowerCenter, the Informatica Analyst tool, and Informatica Data Quality (IDQ).
  • Extensive experience in ITSM, BMC Remedy AR System, and BMC tools, including development, administration, configuration, integration, and testing in Remedy.
  • Expertise in data warehousing and OBIEE: creation of schemas (star schema and snowflake schema modeling), fact and dimension tables, slowly changing dimensions and soft deletes, cross-reporting, and performance tuning techniques such as horizontal and vertical federation and content-level settings (see the slowly changing dimension sketch after this summary).
  • Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad, and FastExport.
  • Experienced in Data masking using various tools.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Experienced in creating reports, dashboards, and ad-hoc analyses using the reporting tool Tableau.
  • Excellent working knowledge on using Oracle Enterprise Asset Management (EAM).
  • Excellent knowledge of PL/SQL; managed scripts and code for creating, populating, updating, and truncating tables stored in databases such as Oracle, Teradata, and SQL Server.
  • Extensive working experience in normalization and de-normalization techniques for both OLTP and OLAP systems, creating database objects like tables, constraints (primary key, foreign key, unique, default), and indexes.
  • Extensively worked with the Erwin tool, including reverse engineering, forward engineering, subject areas, domains, and naming standards documents.
  • Experienced working with Excel Pivot and VBA macros for various business scenarios.
  • Strong experience in data analysis, data migration, data cleansing, transformation, integration, data import, and data export using ETL tools such as Informatica PowerCenter.
  • Experienced in testing and writing SQL and PL/SQL statements - Stored Procedures, Functions, Triggers and packages.
  • Excellent knowledge on creating reports on Pentaho Business Intelligence, Tibco Spotfire & Tableau.
  • Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
  • Excellent experience in Data mining with querying and mining large datasets to discover transition patterns and examine financial data.
  • Excellent in creating various artifacts for projects which include specification documents, data mapping and data analysis documents.
  • An excellent team player and technically strong person with the ability to work with business users, project managers, team leads, architects, and peers, maintaining a healthy environment in the project.
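To illustrate the slowly changing dimension work referenced above, here is a minimal SCD Type 2 sketch in Oracle-style SQL; dim_customer, stg_customer, the tracked address column, and the sequence are hypothetical names, not objects from any project below.

```sql
-- Minimal SCD Type 2 sketch; all object names are illustrative assumptions.
-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE dim_customer d
   SET d.effective_end_dt = SYSDATE,
       d.current_flag     = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address    <> d.address);

-- Step 2: insert a fresh "current" version for new or just-expired customers.
INSERT INTO dim_customer
       (customer_key, customer_id, address,
        effective_start_dt, effective_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id  = s.customer_id
                      AND d.current_flag = 'Y');
```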

TECHNICAL SKILLS

Data Warehousing: Informatica (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Pentaho (BI), SSIS.

Reporting Tools: Business Objects 6.5, XI R2

Data Modeling: Erwin, ER Studio, Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables.

Testing Tools: WinRunner, LoadRunner, TestDirector, Mercury Quality Center, Rational ClearQuest

RDBMS: Oracle 12c/11g/10g/9i/8i/7.x, MS SQL Server, UDB DB2 9.x, Teradata V2R6/R12/R13, MS Access 7.0

Programming: SQL, PL/SQL, UNIX Shell Scripting, VB Script

Environment: Windows (95, 98, 2000, NT, XP), UNIX

Other Tools: TOAD, MS-Office suite (Word, Excel, Project and Outlook), BTEQ, Teradata V2R6/R12/R13 SQL Assistant

PROFESSIONAL EXPERIENCE

Confidential, Livonia, MI

Sr. Data Architect/Data Modeler

Responsibilities:

  • Involved in data mapping specifications used to create and execute detailed system test plans.
  • Involved with data analysis, primarily identifying data sets, source data, source metadata, data definitions, and data formats.
  • Created logical and physical data models using Erwin to meet the needs of the organization's information systems and business requirements.
  • Worked on creating profiles, profile models, rules and mappings with IDQ.
  • Created Profiles using IDQ rules and filters.
  • Designed and reviewed the high-level architecture and design of database components.
  • Analyzed functional and non-functional categorized data elements for data migration, data profiling, and mapping from the source to the target data environment.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Built several MDM landing-to-staging mappings in Informatica MDM Hub.
  • Developed working documents to support findings and assign specific tasks.
  • Analyzed the business information requirements and researched the OLTP source systems to identify the measures, dimensions, and facts required for the reports.
  • Analyzed, mapped and loaded foundation data from legacy ERP to BMC Remedy.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Defined business rules in Informatica Data Quality (IDQ) to evaluate data quality by creating cleanse processes to monitor compliance with standards, identified areas with data quality gaps, and assisted in resolving data quality issues.
  • Worked in building mappings for populating data into the MDM landing tables by processing the errors as DQ violations and re-processing them.
  • Used BMC Migrator and Importer to load and maintain foundation data.
  • Managed replication across multiple databases; tuned the SQL queries in the DAO (data access layer) and optimized the existing stored procedures, functions, database triggers, and materialized views for better performance.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Exported mappings from IDQ to Informatica PowerCenter.
  • Developed data transformation and cleansing rules for migration using ETL tools.
  • Coordinated data migration from old to new system during Teradata system capacity expansion.
  • Debugged existing MDM outbound views and changed them according to the requirements.
  • Architected and implemented enterprise and business-segment BI and OLTP solutions.
  • Performed data mining on data using very complex SQL queries and discovered patterns.
  • Widely used normalization methods and normalized the models to third normal form (3NF).
  • Used various Oracle features like collections, associative arrays, and bulk-processing methods to write effective code (see the bulk-processing sketch after this list).
  • Defined, designed, developed, and optimized ETL processes for the enterprise data warehouse.
  • Worked at conceptual/logical/physical data model level using Erwin according to requirements.
  • Used Developer Studio to create custom standalone applications and to customize out-of-box forms, fields and workflow objects in Remedy ITSM applications.
  • Ran Hadoop streaming jobs to process terabytes of XML-format data.
  • Performed database performance tuning using the EXPLAIN PLAN and TKPROF utilities and debugged the SQL code.
  • Performed relational and dimensional data modeling for the logical and physical design of databases and ER diagrams using data modeling tools such as Erwin.
  • Resolved data issues and updates for multiple applications using SQL queries/scripts.
  • Created the dimensional model for the reporting system by identifying the required dimensions and facts using Erwin.
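As a concrete example of the Oracle bulk-processing approach mentioned in the list above, here is a minimal PL/SQL sketch; src_orders, tgt_orders, and the 1,000-row batch size are assumptions for illustration.

```sql
-- Bulk-processing sketch: batched fetch with BULK COLLECT, batched DML with FORALL.
-- Table names and the LIMIT value are illustrative assumptions.
DECLARE
  TYPE t_order_tab IS TABLE OF src_orders%ROWTYPE;
  l_orders t_order_tab;
  CURSOR c_src IS SELECT * FROM src_orders;
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_orders LIMIT 1000;  -- fetch in batches
    EXIT WHEN l_orders.COUNT = 0;
    FORALL i IN 1 .. l_orders.COUNT                     -- one context switch per batch
      INSERT INTO tgt_orders VALUES l_orders(i);
    COMMIT;
  END LOOP;
  CLOSE c_src;
END;
/
```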

Environment: Erwin 9.x, DB2, Oracle 12c, Information Analyzer, Informatica, Pig, MDM, Quality Center, Excel, Teradata, Metadata, Hive, PL/SQL, UNIX, Netezza, MS Word, Informatica MDM, SQL.

Confidential, Irving, TX

Sr. Data Architect/Data Modeler

Responsibilities:

  • Worked extensively with the business analysis team in gathering requirements and understanding the workflows of the organization.
  • Developed database systems, data architectural designs, and other related databases.
  • Worked with data stewards and subject matter experts to research reported data anomalies, identified root causes, and determined appropriate solutions.
  • Provided support in the organization's database architecture through database design, modeling, and implementation.
  • Performance-tuned heavy queries and optimized Informatica MDM jobs.
  • Worked with data source systems and client systems to identify data issues and data gaps, and identified and recommended solutions.
  • Created SQL queries and performed query performance tuning.
  • Worked closely with Architects and developers to deliver the database components for large scale, complex web based applications.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer, using both Informatica PowerCenter and IDQ.
  • Prepared logical data models using Erwin, containing the set of diagrams and supporting documents with the essential business elements, detailed definitions, and descriptions of the relationships between the data elements, to analyze and document business data requirements.
  • Installed and configured BMC Remedy ARS 8.1.02 with the ITSM suite into a high-availability server group for three environments (DEV, QA, and PROD).
  • Installed BMC Mid-Tier application using Apache Tomcat web servers.
  • Created SAP transaction iViews, URL iViews, and MDM search, result-set, and item-detail iViews; customized standard MDM portal content to make sure the views are user-friendly.
  • Implemented the dimensional model for the Data Mart and was responsible for generating DDL scripts using Erwin.
  • Developed unit and system test cases using system procedures to check data consistency and adherence to the defined data model, and checked data quality using IDQ.
  • Created a high-level, industry-standard, generalized data model and converted it into logical and physical models at later stages of the project using Erwin.
  • Designed a metadata repository to store data definitions for entities, attributes, and mappings between the data warehouse and source system data elements.
  • Worked on data manipulation and analysis; accessed raw data in varied formats with different methods, analyzing and processing the data.
  • Performed data modeling and data analysis as required.
  • Assisted Remedy On-Premise and On-Demand customers with Remedy ITSM Service Desk and Foundation issues.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users (see the profiling query after this list).
  • Extracted data from various sources like Oracle, mainframes, and flat files and loaded it into the target Netezza and Teradata databases.
  • Performed data mapping between source systems and target systems and logical data modeling; created class diagrams and ER diagrams and used SQL queries to filter data.
  • Extensively used ETL methodology to support data extraction, transformation, and loading in a complex EDW using Informatica.
  • Involved in data mining, transformation, and loading from the source systems to the target system.
  • Participated in all phases of Remedy (application) life cycle development including requirement gathering, analysis, design and development, testing (QA) and implementation.
  • Supported business areas and database platforms to ensure logical data model and database design, creation, and generation follow enterprise standards, processes, and procedures.
  • Provided input into database system optimization for performance efficiency and worked on the full lifecycle of data modeling (logical - physical - deployment).
  • Involved with data cleansing/scrubbing and validation.
  • Performed dicing and slicing on data using pivot tables to acquire the churn rate pattern and prepared reports as required.
  • Documented innovative solutions to workflow problems in the form of public Remedy knowledge base entries.
  • Prepared in-depth data analysis reports weekly, biweekly, and monthly using MS Excel, SQL, and UNIX.
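The data profiling referenced in this list typically reduces to column-level SQL like the following sketch; src_customer and its columns are placeholder names, not actual project objects.

```sql
-- Column-level profiling sketch; one query per candidate source table.
-- src_customer, customer_id, and created_dt are placeholder names.
SELECT COUNT(*)                                             AS row_cnt,
       COUNT(DISTINCT customer_id)                          AS distinct_ids,
       SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) AS null_ids,
       MIN(created_dt)                                      AS earliest_rec,
       MAX(created_dt)                                      AS latest_rec
  FROM src_customer;
```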

Environment: PL/SQL, Informatica 9.x, Oracle 11g, Netezza, Aginity, Hive, Erwin Data Modeler, UNIX, SQL, Hadoop, Mainframe, Informatica MDM, Teradata.

Confidential, Mason, OH

Sr. Data Analyst / Data Modeler

Responsibilities:

  • Actively involved in creating Physical and Logical models using Erwin.
  • Created data masking mappings to mask the sensitive data between production and test environments.
  • Worked on IDQ for Data profiling, cleansing and matching.
  • Responsible for designing, testing, deploying, and documenting the data quality procedures and their outputs.
  • Performed data modeling using Erwin; identified objects and relationships and how they fit together as logical entities, which were then translated into the physical design using Erwin's forward engineering.
  • Developed logical and physical data models that capture current-state/future-state data elements and data flows using Erwin / star schema.
  • Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Wrote several shell scripts using UNIX Korn shell for file transfers, error logging, data archiving, checking the log files, and the cleanup process.
  • Used the ER Studio and Erwin tools for modeling logical data models.
  • Performed metrics reporting, data mining, and trend analysis in a helpdesk environment using Access.
  • Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts to ensure any change control in requirements leads to test case updates.
  • Extensively used ETL to load data from flat files (Excel/Access) into the Oracle database.
  • Designed and developed Oracle PL/SQL and shell scripts for data import/export, data conversions, and data cleansing.
  • Created logical/physical data models in 3NF in the warehouse area of the Enterprise Data Warehouse.
  • Involved in extensive data validation by writing several complex SQL queries, in back-end testing, and in working through data quality issues (see the reconciliation query after this list).
  • Worked on all phases of the data warehouse development lifecycle, from gathering requirements to testing, implementation, and support, using Pentaho Data Integration and SSIS.
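The data validation work above usually comes down to source-to-target reconciliation queries of the following shape; src_orders and wh_orders are placeholder names for a source table and its warehouse counterpart.

```sql
-- Rows present in the source but missing (or different) in the warehouse.
SELECT customer_id, order_amt FROM src_orders
MINUS
SELECT customer_id, order_amt FROM wh_orders;

-- Quick row-count reconciliation between the two layers.
SELECT (SELECT COUNT(*) FROM src_orders) AS src_cnt,
       (SELECT COUNT(*) FROM wh_orders)  AS tgt_cnt
  FROM dual;
```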

Environment: Erwin, MS SQL Server, Oracle, MS Office, Business Objects, ClearQuest, ClearCase, SQL, PL/SQL, Informatica, Teradata.

Confidential, Buffalo, NY

Data Analyst/Modeler

Responsibilities:

  • Performed data analysis and profiling of source data to better understand the sources.
  • Managed, updated, and manipulated report orientation and structures using advanced Excel functions, including pivot tables and VLOOKUPs.
  • Used data cleansing techniques, Excel pivot tables, formulas, and charts.
  • Created SAS datasets by extracting data from Excel and flat files.
  • Extensively used MS Access to pull data from various databases and integrate the data.
  • Involved in mapping spreadsheets that provide the Data Warehouse development (ETL) team with source-to-target data mapping, inclusive of logical names, physical names, data types, domain definitions, and corporate metadata definitions.
  • Extensively used SQL for data analysis and to understand data behavior.
  • Used Erwin's Model Mart for effective model management, sharing, dividing, and reusing model information and designs for productivity improvement.
  • Created schema objects like indexes, views, sequences, triggers, grants, roles, and snapshots.
  • Designed the procedures for getting the data from all systems into the data warehousing system; the data was standardized to store the various business units in tables.
  • Responsible for different data mapping activities from source systems to Teradata.
  • Worked on importing and cleansing data from various sources like Teradata, Oracle, flat files, and SQL Server 2005 with high-volume data.
  • Developed star schemas and snowflake schemas when designing the logical model into the dimensional model (see the DDL sketch after this list).
  • Tuned the Informatica mappings for optimum performance and scheduled ETL sessions.
  • Conducted several physical data model training sessions with the ETL developers and worked with them on a day-to-day basis to resolve questions on the physical model.
  • Extensively worked on documentation of data models, mappings, transformations, and scheduling jobs.
  • Involved in creating ETL mapping documents for data warehouse projects.
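A simplified DDL sketch of the star-schema pattern described above; the table and column names are invented for illustration and do not reflect the actual model.

```sql
-- Star schema sketch: two dimension tables joined to one fact table.
CREATE TABLE dim_date (
  date_key     NUMBER       PRIMARY KEY,
  calendar_dt  DATE         NOT NULL,
  fiscal_month VARCHAR2(10)
);

CREATE TABLE dim_product (
  product_key  NUMBER        PRIMARY KEY,
  product_id   VARCHAR2(20)  NOT NULL,
  product_name VARCHAR2(100)
);

CREATE TABLE fact_sales (
  date_key    NUMBER NOT NULL REFERENCES dim_date (date_key),
  product_key NUMBER NOT NULL REFERENCES dim_product (product_key),
  sales_amt   NUMBER(12,2),
  units_sold  NUMBER
);
```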

Environment: Quality Center, MS Excel 2007, PL/SQL, Business Objects XI R2, Informatica (ETL), Oracle 10g, Teradata R12, Teradata SQL Assistant.

Confidential

Data Analyst

Responsibilities:

  • Used and supported database applications and tools for extraction, transformation, and analysis of raw data.
  • Developed, managed, and validated existing data models, including logical and physical models of the data warehouse and source systems, utilizing a 3NF model.
  • Developed and programmed test scripts to identify and manage data inconsistencies and to test ETL processes.
  • Created data masking mappings to mask the sensitive data between production and test environments.
  • Helped the BI and ETL developers, the project manager, and end users in understanding the data model, data flow, and the expected output for each model created.
  • Wrote simple and advanced SQL queries and scripts to create standard and ad hoc reports for senior managers (see the example query after this list).
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked on enhancements to the Data Warehouse model using Erwin per the business reporting requirements.
  • Worked with the Data Quality team in defining and configuring rules, monitoring, and preparing data quality analyses and dashboards using Trillium.
  • Analyzed the business requirements by dividing them into subject areas and understood the data flow within the organization.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle, SQL Server, and DB2.
  • Flexible to work late hours to coordinate with the offshore team.
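For the ad hoc reporting noted above, a typical query looks like the following sketch; helpdesk_tickets and its columns are hypothetical names chosen for illustration.

```sql
-- Ad hoc summary report sketch: monthly ticket volume and average
-- resolution time per business unit (all names are placeholders).
SELECT business_unit,
       TO_CHAR(opened_dt, 'YYYY-MM')          AS report_month,
       COUNT(*)                               AS ticket_cnt,
       ROUND(AVG(resolved_dt - opened_dt), 1) AS avg_days_to_resolve
  FROM helpdesk_tickets
 GROUP BY business_unit, TO_CHAR(opened_dt, 'YYYY-MM')
 ORDER BY business_unit, report_month;
```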

Environment: Oracle, SQL, Erwin, MS Office, Business Objects, ClearQuest, ClearCase, DB2.
