Sr. Data Architect/Data Modeler Resume
St. Louis, MO
SUMMARY:
- 9+ years of IT experience as a Data Architect/Data Modeler and Data Analyst across architecture, design, and development.
- Experience in importing and exporting data with Sqoop from HDFS to relational database systems/mainframes and vice versa.
- Strong experience with different project methodologies including Agile Scrum Methodology and Waterfall methodology.
- Experience in creating derived fields using calculated fields and parameters in Tableau.
- Excellent experience with big data technologies such as Hadoop, BigQuery, MongoDB, Hive, HBase, Pig, and Cassandra.
- Good understanding of and hands-on experience in setting up and maintaining NoSQL databases such as Cassandra, MongoDB, and HBase.
- Experienced in designing conceptual, logical, and physical data models using Erwin, ER Studio, and Sybase PowerDesigner.
- Strong experience architecting high-performance databases using PostgreSQL, PostGIS, MySQL, and Cassandra.
- Experience in data architecture, data modeling, design, and data analysis with conceptual, logical, and physical modeling for Online Transaction Processing (OLTP), Online Analytical Processing (OLAP), and data warehousing.
- Strong experience in normalization (1NF, 2NF, 3NF, and BCNF) and denormalization techniques for optimal performance.
- Experienced in integrating SAS with third-party software such as databases (DB2, SQL Server, Oracle), web application servers, and the SAS HPA architecture.
- Good knowledge of data marts, Operational Data Stores (ODS), and dimensional data modeling with the Ralph Kimball methodology using Analysis Services.
- Experience working with Business Intelligence and Enterprise Data Warehouse (EDW) tools including SSAS, Pentaho, Cognos, OBIEE, QlikView, Greenplum, Amazon Redshift, and Azure Data Warehouse.
- Experienced in setting up connections to different databases such as Oracle, SQL Server, DB2, Hadoop, Teradata, and Netezza according to user requirements.
- Extensive experience using Excel pivot tables to run and analyze result data sets, and in UNIX scripting.
- Extensive experience with ETL and reporting tools such as SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
- Strong understanding of data warehousing principles, fact tables, and dimension tables.
- Experience using Dynamic Management Views (DMVs) and Dynamic Management Functions (DMFs), and writing SQL and PL/SQL statements, T-SQL scripts, triggers, and stored procedures for troubleshooting.
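As a small illustration of the normalization work listed above, the following sketch splits a denormalized table into 3NF. SQLite stands in for the production RDBMS, and the table and column names are hypothetical examples, not ones from any actual engagement:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized source: customer attributes repeat on every order row (violates 3NF).
cur.execute("""CREATE TABLE orders_denorm (
    order_id INTEGER, customer_id INTEGER,
    customer_name TEXT, customer_city TEXT, amount REAL)""")
cur.executemany("INSERT INTO orders_denorm VALUES (?,?,?,?,?)", [
    (1, 100, "Acme", "St. Louis", 250.0),
    (2, 100, "Acme", "St. Louis", 75.0),
    (3, 200, "Globex", "Brentwood", 120.0),
])

# 3NF decomposition: customer attributes move to their own table keyed by customer_id.
cur.execute("""CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY, customer_name TEXT, customer_city TEXT)""")
cur.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)""")
cur.execute("""INSERT INTO customers
    SELECT DISTINCT customer_id, customer_name, customer_city FROM orders_denorm""")
cur.execute("INSERT INTO orders SELECT order_id, customer_id, amount FROM orders_denorm")

# A join reconstructs the original rows without storing customer data redundantly.
rows = cur.execute("""SELECT o.order_id, c.customer_name, o.amount
                      FROM orders o JOIN customers c USING (customer_id)
                      ORDER BY o.order_id""").fetchall()
print(rows)
```

The repeated customer attributes collapse from three stored copies to two customer rows, while the join still reproduces every original order.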
TECHNICAL SKILLS:
Data Modeling Tools: Erwin R9.6.1/9.5, Sybase Power Designer, ER Studio and Oracle Designer
Big Data & NoSQL: Hadoop, YARN, Sqoop, Flume, Kafka, Splunk, Hive, Pig, Oozie, Storm, Cassandra, HBase, MapReduce, Impala.
Database Tools: Microsoft SQL Server 2016/2014, Teradata 15.0, Oracle 12c/11g, DB2.
BI Tools: Tableau, Tableau server, Tableau Reader, SAP Business Objects, Crystal Reports
Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, SharePoint Portal Server
Operating Systems: Microsoft Windows 7/8/XP, Linux, UNIX.
Version Tool: VSS, SVN, CVS.
Tools & Utilities: TOAD 9.6, Microsoft Visio 2010.
Quality Assurance Tools: Win Runner, Load Runner, Test Director, Quick Test Pro, Quality Center, Rational Functional Tester.
Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Waterfall Model.
PROFESSIONAL EXPERIENCE:
Confidential, St. Louis, MO
Sr. Data Architect/Data Modeler
Responsibilities:
- Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
- Involved in technical consulting and end-to-end delivery covering architecture, data modeling, data governance, and the design, development, and implementation of solutions.
- Used Agile Methodology of Data Warehouse development using Kanbanize.
- Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
- Designed and developed data marts following star schema and snowflake schema methodologies using Erwin.
- Involved in reviewing business requirements and analyzing data sources from Excel, Oracle, and SQL Server 2014 for the design, development, testing, and production rollout of reporting and analysis projects built in Tableau Desktop against a Netezza database.
- Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
- Performed data mapping and data design (data modeling) to integrate data across multiple databases into the EDW.
- Created MDM base objects and landing and staging tables to follow the comprehensive data model in MDM.
- Worked on Amazon Redshift and AWS, architecting a solution to load data, create data models, and run BI on top of it.
- Worked with the Hadoop ecosystem, covering HDFS, HBase, YARN, and MapReduce.
- Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
- Involved in key-value data modeling and the data load process, and classified the key business drivers for the data management initiative.
- Ensured high-quality data and an understanding of how data is generated from experimental design and how these experiments can produce actionable, trustworthy conclusions.
- Worked with the ETL team to document the transformation rules for data migration from OLTP to Warehouse environment for reporting purposes.
- Involved in validation of the OLAP reports, including unit testing and system testing of report functionality and the data displayed in the reports.
- Connected to Amazon Redshift through Tableau to extract live data for real time analysis.
- Worked in a NoSQL database on simple queries and wrote stored procedures for normalization and denormalization.
- Loaded files to Hive and HDFS from Oracle, and loaded data from the UNIX file system into HDFS.
- Involved in translating business needs into long-term architecture solutions and reviewing object models, data models and metadata.
- Responsible for defining the testing procedures, test plans, error handling strategy and performance tuning for mappings, Jobs and interfaces.
- Worked with DBAs to create a best-fit physical data model from the logical data model using Erwin 9.5.
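The star-schema data marts described above can be sketched at miniature scale. This is a hedged illustration, not the actual mart design; SQLite stands in for Netezza/Oracle, and all table and column names (dim_date, dim_product, fact_sales) are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables carry descriptive attributes; the fact table holds
# measures plus foreign keys into the dimensions (the "star" layout).
cur.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER)")
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT)")
cur.execute("""CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    sales_amount REAL)""")

cur.executemany("INSERT INTO dim_date VALUES (?,?,?)", [(20170101, 2017, 1), (20170201, 2017, 2)])
cur.executemany("INSERT INTO dim_product VALUES (?,?)", [(1, "Widget"), (2, "Gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?,?,?)", [
    (20170101, 1, 100.0), (20170101, 2, 50.0), (20170201, 1, 80.0)])

# A typical OLAP-style rollup: total sales by month via a fact-dimension join.
result = cur.execute("""SELECT d.month, SUM(f.sales_amount)
                        FROM fact_sales f JOIN dim_date d USING (date_key)
                        GROUP BY d.month ORDER BY d.month""").fetchall()
print(result)  # [(1, 150.0), (2, 80.0)]
```

Snowflaking would further normalize the dimensions (e.g. splitting product category into its own table); the fact table and rollup query stay the same shape.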
Environment: Erwin 9.6, Oracle 12c, PL/SQL, UNIX, Agile, TIDAL, MDM, ETL, BTEQ, SQL Server 2014, Netezza, DB2, SAS, Tableau, NoSQL, SSRS, SSIS, T-SQL, Informatica
Confidential, Brentwood, TN
Sr. Data Architect/Data Modeler
Responsibilities:
- Developed a high-performance, scalable data architecture solution incorporating a matrix of technologies to relate architectural decisions to business needs.
- Led the development and implementation of the logical data model and physical data design, utilizing data architecture and modeling standards.
- Responsible for interacting with business stakeholders, gathering requirements, and managing delivery across the entire Tableau development life cycle.
- Handled importing data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Developed T-SQL scripts to create database objects and perform DML and DDL tasks, and wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
- Developed various QlikView data models by extracting and using data from source files including Excel, flat files, and big data sources.
- Involved in debugging and tuning PL/SQL code and tuning and optimizing queries for the Oracle database.
- Worked in SAS for data analysis and was involved in importing and cleansing high-volume data from various sources such as Teradata, Oracle, flat files, and Netezza.
- Developed reports using SQL Server Reporting Services (SSRS) and was involved in OLAP unit testing and system testing of report functionality and the data displayed in the reports.
- Populated and refreshed Teradata tables using FastLoad, MultiLoad, and FastExport utilities for user acceptance testing, and loaded history data into Teradata.
- Involved in providing production support for various Ab Initio jobs and developing various UNIX shell wrappers to run Ab Initio and database jobs.
- Performed code reviews for ETL mappings from a performance and error-handling perspective.
- Collaborated with business users to define key business requirements and translate them into process and technical solutions.
- Collected and analyzed the user requirements and the existing application and designed logical and physical data models.
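The data-profiling work mentioned above usually starts with per-column null counts and distinct counts to answer business questions about source quality. A minimal sketch in pure Python; the sample records and field names are hypothetical:

```python
def profile(records):
    """Return per-column null counts and distinct (non-null) value counts."""
    result = {}
    for col in records[0].keys():
        values = [r[col] for r in records]
        non_null = [v for v in values if v is not None]
        result[col] = {
            "nulls": len(values) - len(non_null),   # missing-data check
            "distinct": len(set(non_null)),         # cardinality check
        }
    return result

rows = [
    {"customer_id": 1, "state": "TN"},
    {"customer_id": 2, "state": None},
    {"customer_id": 3, "state": "TN"},
]
print(profile(rows))
```

In practice the same counts come from the profiling features of an ETL tool or from GROUP BY queries, but the metrics, nulls and cardinality per column, are the same.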
Environment: PowerDesigner, RDBMS, HDFS, Sqoop, Excel, Ab Initio, MS Access, UNIX shell, Big Data/Hadoop.
Confidential, New York, NY
Sr. Data Analyst /Data Modeler
Responsibilities:
- Involved in data profiling and data cleansing to eliminate data redundancy and enhance database performance.
- Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their formats.
- Created DDL scripts using ER Studio and source-to-target mappings to bring the data from the source to the warehouse.
- Worked on Conceptual, Logical Modeling and Physical Database design for OLTP and OLAP systems.
- Designed data marts using Ralph Kimball's dimensional data mart modeling methodology in ER Studio.
- Involved in writing T-SQL and working on SSIS, SSRS, SSAS, data cleansing, data scrubbing, and data migration.
- Performed data analysis and reporting using MySQL, MS PowerPoint, MS Access, and SQL Assistant.
- Involved in writing scripts for loading data into the target data warehouse using BTEQ, FastLoad, and MultiLoad.
- Worked on data modeling using dimensional data modeling, star/snowflake schemas, fact and dimension tables, and physical and logical data modeling.
- Extensively involved in Recovery process for capturing the incremental changes in the source systems for updating in the staging area and data warehouse respectively
- Involved in creating dashboards and reports in Tableau, and maintained server activities, user activity, and customized views on server analysis.
- Developed UNIX Shell Scripts for database code change deployment to various environments and Job Scheduling.
- Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
- Developed multiple data flow tasks for different packages using MS SQL Server Integration Services.
- Created tables and views using T-SQL to pull data for daily reports of interval-level call data.
- Involved in designing and implementing data extraction (XML data stream) procedures.
- Involved in MySQL, MS PowerPoint, and MS Access database design, and designed a new database on Netezza for an optimized outcome.
- Used SQL assistant tool extensively to profile data and check mapping accuracy.
- Performed Data mapping between source systems to Target systems, logical data modeling, created class diagrams and ER diagrams and used SQL queries to filter data
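The source-to-target mapping work above can be sketched as a small declarative transform: each target field names its source field plus a transformation rule. The field names and rules here are hypothetical examples:

```python
# Each target column maps to (source column, transformation rule).
mapping = {
    "cust_name": ("CUSTOMER_NM", str.strip),   # trim padded source text
    "cust_state": ("ST_CD", str.upper),        # standardize state codes
    "balance": ("ACCT_BAL", float),            # cast text amounts to numeric
}

def apply_mapping(source_row, mapping):
    """Produce a target row by applying each mapping rule to its source field."""
    return {target: rule(source_row[src]) for target, (src, rule) in mapping.items()}

src = {"CUSTOMER_NM": "  Acme  ", "ST_CD": "ny", "ACCT_BAL": "120.50"}
print(apply_mapping(src, mapping))
# {'cust_name': 'Acme', 'cust_state': 'NY', 'balance': 120.5}
```

Keeping the mapping as data rather than code mirrors how a mapping document drives an ETL tool: the document is the single place where transformation rules live and get reviewed.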
Environment: ER/Studio 8.0, PL/SQL, Informatica 8.x, Oracle 11g, Netezza, UNIX, TIBCO Spotfire, Tableau 8, RDBMS, T-SQL, MS SQL Server 2005, Linux, DBA.
Confidential, McLean, VA
Data Analyst /Data Modeler
Responsibilities:
- Used Erwin to create conceptual, logical, and physical data models.
- Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
- Wrote ad-hoc SQL queries and worked with SQL and Netezza databases.
- Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
- Wrote complex SQL queries to validate data against various reports generated by Business Objects XI R2.
- Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
- Involved in Teradata SQL development, unit testing, and performance tuning, and ensured testing issues were resolved using defect reports.
- Analyzed business requirements by dividing them into subject areas and understood the data flow within the organization.
- Created DDL scripts for implementing Data Modeling changes.
- Created a list of domains in Erwin and worked on building up the data dictionary for the company.
- Migrated data from a database in a Linux environment to MS SQL Server 2005 using an ODBC driver.
- Created a data mapping document after each assignment and wrote the transformation rules for each field as applicable.
- Worked on unit testing for three reports and created SQL test scripts for each report as required.
- Designed and reviewed various documents including the Software Requirement Specification (SRS), Business Requirement Document (BRD), Data Functional Design (DFD), and Functional System Design (FSD).
- Applied data naming standards, created the data dictionary, documented data model translation decisions, and maintained DW metadata.
- Extensively used SQL for performance tuning.
- Developed and executed queries in SQL for reporting using SQL Manager
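Report validation like that described above boils down to reconciling an aggregate computed from the source against the figure the report publishes. A toy sketch, with SQLite and hypothetical table names, of the reconciliation query pattern:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source transactions the report should be summarizing.
cur.execute("CREATE TABLE transactions (region TEXT, amount REAL)")
cur.executemany("INSERT INTO transactions VALUES (?,?)",
                [("East", 100.0), ("East", 40.0), ("West", 60.0)])

# Figures as published in the report under validation.
cur.execute("CREATE TABLE report_totals (region TEXT, total REAL)")
cur.executemany("INSERT INTO report_totals VALUES (?,?)",
                [("East", 140.0), ("West", 55.0)])

# Any region where the source aggregate disagrees with the report is a defect.
mismatches = cur.execute("""
    SELECT r.region, r.total, SUM(t.amount) AS source_total
    FROM report_totals r JOIN transactions t ON t.region = r.region
    GROUP BY r.region, r.total
    HAVING source_total <> r.total""").fetchall()
print(mismatches)  # [('West', 55.0, 60.0)]
```

East reconciles (140.0 both ways); West surfaces as a mismatch to log in a defect report.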
Environment: Erwin 8.0, Oracle 10g, ODS, OLAP, OLTP, Star Schema, Snowflake Schema, NoSQL, Business Objects, Agile, DB2, SAP
Confidential
Data Analyst
Responsibilities:
- Gathered and translated business requirements into detailed Business Requirement Documents and Functional Requirement Specifications, and was involved in analyzing them.
- Developed logical data models and their associated metadata based on the business rules and requirements
- Documented and published the data models and their associated metadata.
- Assisted the application teams with the data needs related to new releases or other changes and updated the data models as needed
- Involved in mapping data from one store to another as may be required for data migrations, data loads, data integration
- Identified possible sources for data population and promoted the use of shared data within a business area
- Worked on the reporting requirements and involved in generating the reports for the Data Model.
- Redefined attributes and relationships in the model and cleansed unwanted tables/columns as part of data analysis responsibilities.
- Designed and developed database objects (databases, tables, stored procedures, DTS packages) to support the collection, tracking, and reporting of business data.
- Worked extensively on data quality (running data profiling and examining profile outcomes) and metadata management (loading metadata, mapping metadata, and performing data linkage).
- Involved in deploying to the production environment along with developers and providing production support when data issues arose.
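The data-linkage step above can be sketched as an exact-key match between two sources, pairing matched records and flagging the rest for follow-up. The record sets and key name here are hypothetical:

```python
def link_records(left, right, key):
    """Join two record sets on an exact key.

    Returns (matched pairs, unmatched left rows); unmatched rows are the
    candidates for manual review or fuzzier matching downstream.
    """
    index = {r[key]: r for r in right}   # one pass to index the right side
    matched, unmatched = [], []
    for row in left:
        if row[key] in index:
            matched.append((row, index[row[key]]))
        else:
            unmatched.append(row)
    return matched, unmatched

crm = [{"ssn": "111", "name": "A. Smith"}, {"ssn": "222", "name": "B. Jones"}]
billing = [{"ssn": "111", "balance": 10.0}]
matched, unmatched = link_records(crm, billing, "ssn")
print(len(matched), len(unmatched))  # 1 1
```

Real linkage tools add normalization and probabilistic matching on top, but the core shape, index one side, probe with the other, partition the results, is the same.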
Environment: Erwin 7.0, Oracle 9i, Crystal Reports, SQL Server 2005, Windows XP, Oracle SQL Developer, SSRS, SSIS, MS Excel