Sr. Data Analyst/Data Modeler/Data Architect Resume


Buffalo, NY

SUMMARY:

  • Over 11 years of experience in Data Architecture, Design, Development and Testing of business application systems, including Data Analysis and the development of Conceptual, Logical and Physical data models for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems.
  • Experienced working with data modeling tools like Erwin, Power Designer and ER Studio.
  • Experienced in designing Star and Snowflake schemas for Data Warehouse and ODS architectures.
  • Experienced in Dimensional and Relational Data Modeling, including Star/Snowflake schema modeling, FACT and Dimension tables, and Physical and Logical Data Modeling.
  • Experienced in Requirement gathering, System analysis, handling business and technical issues & communicating with both business and technical users.
  • Experienced in Data Profiling and Analysis, following and applying appropriate database standards and processes in the definition and design of enterprise business data hierarchies.
  • Very good knowledge of and experience with AWS services, including Redshift, S3 and EMR.
  • Designed the Data Marts in dimensional data modeling using star and snowflake schemas.
  • Proficient with Data Analysis, mapping source and target systems for data migration efforts and resolving issues relating to data migration.
  • Excellent development experience in SQL and the procedural languages (PL) of databases like Oracle, Teradata, Netezza and DB2.
  • Experienced in Data Scrubbing/Cleansing, Data Quality, Data Mapping, Data Profiling and Data Validation in ETL.
  • Experienced in creating and documenting metadata for OLTP and OLAP when designing systems.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Excellent knowledge of Ralph Kimball's and Bill Inmon's approaches to Data Warehousing.
  • Extensive experience in development of T-SQL, DTS, OLAP, PL/SQL, Stored Procedures, Triggers, Functions and Packages, including performance tuning and optimization for business logic implementation.
  • Experienced using query tools like SQL Developer, PL/SQL Developer, and Teradata SQL Assistant.
  • Excellent in performing data transfer activities between SAS and various databases and data file formats like XLS, CSV, DBF, MDB, etc.
  • Expertise in developing standard and re-usable mappings using various transformations like Expression, Aggregator, Joiner, Source Qualifier, Lookup and Router.
  • Expertise in designing complex mappings, in performance tuning, and in Slowly Changing Dimension tables and Fact tables.
  • Extensively worked with Teradata utilities BTEQ, FastExport and MultiLoad to export and load data to/from different source systems, including flat files.
  • Experienced in big data analysis and in developing data models using Hive, Pig, MapReduce and SQL, with strong data architecting skills for designing data-centric solutions.
  • Excellent working knowledge of COBOL and IBM Mainframes
  • Expertise in extracting, transforming and loading data between homogeneous and heterogeneous systems like SQL Server, Oracle, DB2, MS Access, Excel and flat files using SSIS packages.
  • Experienced in performance tuning on Oracle databases by leveraging explain plans and tuning SQL queries.
  • Oversaw and worked with teams on data migration and CRM implementation.
  • Proficient in System Analysis, ER/Dimensional Data Modeling, Database design and implementing RDBMS specific features.
  • Excellent experience in creating cloud based solutions and architecture using Amazon Web services and Microsoft Azure.
  • Expertise in analyzing and documenting business requirement documents (BRD) and functional requirement documents (FRD) along with Use Case Modeling and UML.
  • Experience in UNIX shell scripting, Perl scripting and automation of ETL Processes.
  • Extensively used ETL (Power Center / Power Exchange) to load data from source systems like flat files and Excel files into staging tables and on into the target Oracle database; analyzed the existing systems and performed a feasibility study.
  • Excellent understanding of and working experience with industry-standard methodologies like the System Development Life Cycle (SDLC), the Rational Unified Process (RUP) and Agile methodologies.
  • Experience in source systems analysis and data extraction from various sources like Flat files, Oracle 12c/11g/10g/9i/8i, IBM DB2 UDB, XML files.
  • Experienced in developing Entity-Relationship diagrams and modeling Transactional Databases and Data Warehouse using tools like ERWIN, ER/Studio and Power Designer.
  • Experienced with modeling using ERWIN in both forward and reverse engineering cases.
  • Skilled in Data Warehouse loads, determining hierarchies and building various logics to handle Slowly Changing Dimensions.
  • Excellent Team player to work in conjunction with Business analysts, Production Support teams, Subject Matter Experts, Database Administrators and Database developers.
  • Exceptional problem-solving and sound decision-making capabilities, recognized for identifying alternative solutions.

TECHNICAL SKILLS:

Analysis and Modeling Tools: Erwin 7.2/7.0/8.0/r9, Sybase Power Designer, Oracle Designer, Rational Rose, ER/Studio, TOAD, MS Visio, SAS

ETL Tools: Informatica Power Center, DataStage 7.5, Ab Initio, Talend

OLAP Tools: MS SQL Analysis Manager, DB2 OLAP, Cognos PowerPlay

Languages: SQL, PL/SQL, T-SQL, XML, HTML, UNIX Shell Scripting, C, C++, AWK

Databases: Oracle 12c/11g/10g/9i/8i/8.0/7.x, Teradata 14.0, DB2 UDB 8.1, MS SQL Server 2008/2005, Netezza 4.0, Sybase ASE 12.5.3/15, Informix 9

Operating Systems: Windows 7/8, UNIX (Sun Solaris, HP-UX), Windows NT/XP/Vista, MS-DOS

Project Execution Methodologies: Ralph Kimball and Bill Inmon data warehousing methodology, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD)

Database Tools: TOAD, BTEQ, Teradata SQL Assistant

Mainframe: COBOL, IBM Mainframes

Reporting Tools: Business Objects XI R2/6.5/5.0/5.1, Cognos Impromptu 7.0/6.0/5.0, Informatica Analytics Delivery Platform, MicroStrategy, SSRS, Tableau

Office Tools: MS Office suite (Word, Excel, MS Project and Outlook), VSS

Programming Languages: SQL, T-SQL, Base SAS and SAS/SQL, HTML, XML

PROFESSIONAL EXPERIENCE:

Confidential, Buffalo, NY

Sr. Data Analyst/Data Modeler/Data Architect

Responsibilities:

  • Provide data architecture support to enterprise data management efforts, such as the development of the enterprise data model and master and reference data, as well as support to projects, such as the development of physical data models, data warehouses and data marts.
  • Lead the strategy, architecture and process improvements for data architecture and data management, balancing long and short-term needs of the business.
  • Created and maintained Logical and Physical models for the data mart and created partitions and indexes for the tables in the data mart.
  • Providing technical and architectural subject matter expertise to the various development teams including communicating architectural decisions and mentoring other technical staff around the various development technologies and decisions.
  • CRM key contact and data manager for SAP-CRM system, including daily maintenance, instruction and training for all users, and direct relations with IT.
  • Managed CRM - responsible for retention campaigns from conceptualization through to analysis.
  • Responsible for Internal Marketing & CRM applications configuration, workflow development, role maintenance, module builder, reports etc.
  • Developed and introduced new features and capabilities into CRM applications by analyzing information needs and functional requirements to deliver the needed artifacts.
  • Increased system user adoption rates to 3X normal usage by reorganizing the training program: created a CRM training manual, job guides and a product training site, and implemented system requirements based on user-functionality requests.
  • Successfully migrated acquired firms' CRM platforms (Salesforce, Red Hat, etc.) well before project deadlines by delivering gap analysis, functional specifications, data mapping and function mapping for CRM, leading to increased user satisfaction.
  • Performed data profiling and analysis, applied various data cleansing rules, designed data standards, and designed the architecture and relational models.
  • Create new data designs and make sure they fall within the realm of the overall Enterprise BI Architecture.
  • Building relationships and trust with key stakeholders to support program delivery and adoption of enterprise architecture.
  • Maintained metadata (data definitions of table structures) and version control for the data model.
  • Working with technical analysts and software developers to identify and design data models based on requirements definitions and interactive discussions to support both new and existing application system processes
  • Providing technical leadership, mentoring throughout the project life-cycle, developing vision, strategy, architecture and overall design for assigned domain and for solutions
  • Responsible for the development of target data architecture, design principles, quality control, and data standards for the organization
  • Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data and loaded it into Data Warehouse databases.
  • Metadata Architect responsible for providing metadata strategy and design using Informatica Metadata Manager & InfoSphere Business Glossary.
  • Developed and maintained data models, data dictionaries, data maps and other artifacts across the organization, including the conceptual and physical models as well as the metadata repository.
  • Working on a MapR Hadoop platform to implement big data solutions using Hive, MapReduce, shell scripting and Pig.
  • Defined and implemented data quality maintenance processes
  • Developed SQL scripts for creating tables, Sequences, Triggers, views and materialized views.
  • Worked with cloud-based technologies like AWS Redshift, S3, EC2 machines, etc.
  • Analyzed and studied the source system data models to understand concept tie-outs so that the integration process into the existing data warehouse is seamless and data redundancy is eliminated.
  • Used Teradata ordered-analytic (OLAP) functions and clauses like RANK, ROW_NUMBER, QUALIFY, CSUM and SAMPLE (see the first sketch after this section).
  • Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
  • Performed data profiling and data analysis to identify data gaps and become familiar with new source system data.
  • Participated in integration of MDM (Master Data Management) Hub and data warehouses.
  • Extensively used Aginity Netezza Workbench to perform various DML and DDL operations on the Netezza database.
  • Designed dimensional models that can reduce development time for InfoSphere Data Warehouse and IBM Cognos Business Intelligence reporting.
  • Developed multiple MapReduce jobs in Java for data cleaning and pre-processing, and analyzed data in Pig.
  • Analyzed the existing source systems with the help of data profiling and source system data models, creating individual data models for various domains/subject areas for the proposed data warehouse solution.
  • Transformed staging-area data into a STAR schema (hosted on Amazon Redshift), which was then used for developing embedded Tableau dashboards.
  • Translating business requirements into SAS code for use within internal systems and models
  • Extracting the data from the Oracle financials and the Redshift database.
  • Performed Unit Testing, System Integrated Testing for the aggregate tables.
  • Scheduled SSIS package execution in SQL Server Agent, tracked the success or failure of package runs, and configured email notifications through SQL Server Agent.
  • Performed data analysis on the target tables to make sure the data met business expectations.
  • Used Normalization methods up to 3NF and De-normalization techniques for effective performance in OLTP systems.
  • Developed SQL scripts for loading data from staging area to Target tables.
  • Proposed the EDW data design to centralize the data scattered across multiple datasets.
  • Worked on the development of Data Warehouse, Business Intelligence architecture that involves data integration and the conversion of data from multiple sources and platforms.
  • Worked on migrating of EDW to AWS using EMR and various other technologies.
  • Worked on Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad and FastExport.
  • Worked on SQL and SAS script mapping.
  • Followed the Type 2 dimension methodology to design and maintain history data (see the Type 2 sketch after this section).
  • Analyzed all existing SSIS packages, SQL Server objects & new functional specs.
  • Used Netezza GROOM to reclaim space for tables and databases, and extensively managed data skew across all the Netezza database tables (see the GROOM sketch after this section).
  • Used the metadata tool for importing metadata from the repository, creating new job categories and creating new data elements.
  • Researched the database space savings attained as each module was released, reporting before-and-after numbers for the release.
  • Extensively used the Agile methodology as the organization standard to implement the data models.
  • Performed Data Mapping and data design (Data Modeling) to integrate the data across multiple databases into the EDW.

Environment: Oracle 12c, SQL*Plus, Erwin, MS Visio, SAS, Source Offsite (SOS), Hive, Pig, Windows XP, AWS, QC Explorer, SharePoint workspace, Python, Teradata, Agile, DataStage, MDM, Netezza, IBM InfoSphere, SQL, PL/SQL, IBM DB2, SSIS, Power BI, Redshift, Business Objects XI 3.5, COBOL, SSRS, Quick Data.
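
As a brief illustration of the Teradata ordered-analytic queries mentioned above, here is a minimal sketch; the acct_snapshot table and its columns are hypothetical:

```sql
-- Keep only the most recent snapshot per customer (illustrative names).
SELECT customer_id,
       snapshot_dt,
       balance
FROM   acct_snapshot
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id
                           ORDER BY snapshot_dt DESC) = 1;

-- Random 1% spot-check of the same table during data profiling.
SELECT * FROM acct_snapshot SAMPLE 0.01;
```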
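The Type 2 handling referenced above typically expires the current row and inserts a new version; a minimal sketch, assuming hypothetical dim_customer and stg_customer tables with address as the tracked attribute:

```sql
-- Step 1: expire the current version when a tracked attribute changed.
UPDATE dim_customer
SET    eff_end_dt   = CURRENT_DATE,
       current_flag = 'N'
WHERE  current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = dim_customer.customer_id
               AND    s.address <> dim_customer.address);

-- Step 2: insert a new current version for new or changed customers.
INSERT INTO dim_customer (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y'
                   AND    d.address     = s.address);
```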
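For the Netezza space-reclamation and skew work, a sketch of the commands involved (sales_fact is a placeholder table name):

```sql
-- Reclaim space held by logically deleted rows, then refresh statistics.
GROOM TABLE sales_fact RECORDS ALL;
GENERATE STATISTICS ON sales_fact;

-- Check data skew: row counts per data slice should be roughly even.
SELECT datasliceid, COUNT(*) AS row_cnt
FROM   sales_fact
GROUP BY datasliceid
ORDER BY row_cnt DESC;
```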

Confidential, Chicago, IL

Sr. Data Architect/Data Modeler

Responsibilities:

  • Architected and designed solutions for complex business requirements, including data processing, analytics, ETL and reporting processes, to improve the performance of data loads and processes.
  • Developed a high-performance, scalable data architecture solution that incorporates a matrix of technologies to relate architectural decisions to business needs.
  • Participated in the design, development, and support of the corporate operation data store and enterprise data warehouse database environment.
  • Conducted strategy and architecture sessions and delivered artifacts such as the MDM strategy (Current State, Interim State and Target State) and the MDM Architecture (Conceptual, Logical and Physical) at a detailed level.
  • Owned and managed all changes to the data models, Created data models, solution designs and data architecture documentation for complex information systems.
  • Provided leadership around CRM product capability discovery and the gathering of Marketing, IT, Compliance, Security, and Risk requirements for the CRM product management team to determine impacts, SOP updates, and data flow timing.
  • Performed analysis using SQL and SAS for the CRM team with data pulled from the CRM Data Mart.
  • Assisted with the management and organization of a $525-million investment portfolio through the utilization of multiple CRM and account trading software.
  • Analyze change requests for mapping of multiple source systems for understanding of Enterprise wide information architecture to devise Technical Solutions.
  • Worked with SME's and other stakeholders to determine the requirements to identify Entities and Attributes to build Conceptual, Logical and Physical Data Models.
  • Provided data sourcing methodology, resource management and performance monitoring for data acquisition.
  • Ensured reports are optimized for efficiency and negative performance on production environment is mitigated.
  • Documented the SAS project report notice, including the purpose of the requirement, where data is read from and written to, what kind of process it is, how the process runs, and the installation instructions for the upgraded data dictionaries and the new or modified programs.
  • Supported and followed information governance and data standardization procedures established by the organization; documented the reports library as well as external data imports and exports.
  • Prepared Tableau reports and dashboards with calculated fields, parameters, sets, groups and bins, and published them on the server.
  • Extensively used Netezza utilities like NZLOAD and NZSQL, loading data directly from Oracle to Netezza without any intermediate files (see the external-table sketch after this section).
  • Performed analysis of data sources and processes to ensure data integrity, completeness and accuracy.
  • Created a logical design and physical design in Erwin.
  • Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
  • Designed the architecture through effective data modeling, implementing database standards and processes.
  • Developed Data Mapping, Data Governance, and transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
  • Generated ad-hoc reports using OBIEE.
  • Worked with metadata definitions and the import and export of DataStage jobs using DataStage Manager.
  • Responsible for migrating the data and data models from SQL server environment to Oracle environment.
  • Analyzed and designed the ETL architecture; created templates and provided training, consulting, development, deployment, maintenance and support.
  • Created SSIS Packages which loads the data from the CMS to the EMS library database.
  • Involved in data modeling and providing technical solutions related to Teradata to the team.
  • Developed SQL Queries to get complex data from different tables in Hemisphere using joins, database links.
  • Prepared the Data Dictionary/Metadata for the data elements.
  • Designed the physical model for implementing the model into Oracle 11g physical data base.
  • Wrote SQL queries, PL/SQL procedures/packages, triggers and cursors to extract and process data from various source tables of database.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Query UDW using AQT or Teradata SQL assistant and profile data to understand relationships between data objects and domain dependencies
  • Developed LINUX Shell scripts by using NZSQL/NZLOAD utilities to load data from flat files to Netezza database.
  • Involved in mapping the data elements from the User Interface to the Database and help identify the gaps.
  • Used Erwin to create logical and physical data models for enterprise wide OLAP system.
  • Designing and customizing data models for Data warehouse supporting data from multiple sources on real time. Requirements elicitation and Data analysis. Implementation of ETL Best Practices.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
  • Developed complex SQL scripts for Teradata database for creating BI layer on DW for Tableau reporting.
  • Documented the source to target spread sheets which have all the information of source and target fields and all the necessary transformation rules which are in turn used for CCS reports development.
  • Worked on debugging and identifying the unexpected real-time issues in the production server SSIS packages.
  • Worked with ad-hoc reporting using Teradata SQL
  • Extensively used MS Visio for representing existing and proposed data flow Diagrams.

Environment: Erwin, MS Visio, Oracle 11g, Oracle Designer, MDM, Power BI, SAS, SSIS, Tableau, Tivoli Job Scheduler, SQL Server 2012, DataFlux 6.1, PL/SQL, SSRS, DataStage, SQL Navigator, Crystal Reports 9, Netezza.
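
As an illustration of the NZSQL/NZLOAD loads mentioned in this section, here is a minimal Netezza external-table sketch; the schema, file path and options are placeholders:

```sql
-- Load a delimited extract straight into a stage table; this is the
-- external-table equivalent of an nzload invocation.
INSERT INTO stage.customer_addr
SELECT *
FROM   EXTERNAL '/data/in/customer_addr.csv'
SAMEAS stage.customer_addr
USING (DELIMITER ',' SKIPROWS 1 MAXERRORS 10 LOGDIR '/data/log');
```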

Confidential, Omaha, NE

Sr. Data Architect/Data Analyst/Data Modeler

Responsibilities:

  • Design and develop data warehouse architecture, data modeling/conversion solutions, and ETL mapping solutions within structured data warehouse environments
  • Reconcile data and ensure data integrity and consistency across various organizational operating platforms for business impact.
  • Define best practices for data loading and extraction and ensure architectural alignment of the designs and development.
  • Used Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Involved in preparing Logical Data Models/Physical Data Models.
  • Worked extensively in both Forward Engineering as well as Reverse Engineering using data modeling tools.
  • Involved in the creation, maintenance of Data Warehouse and repositories containing Metadata.
  • Identifying inconsistencies or issues from incoming HL7 messages, documenting the inconsistencies, and working with clients to resolve the data inconsistencies
  • Resolved the data type inconsistencies between the source systems and the target system using the Mapping Documents and analyzing the database using SQL queries.
  • Extensively used both Star Schema and Snowflake Schema methodologies in building and designing the logical data model, in both Type 1 and Type 2 Dimensional Models.
  • Worked with DBA group to create Best-Fit Physical Data Model from the Logical Data Model using Forward Engineering.
  • Conducted HL7 integration testing with clients' systems, i.e., testing business scenarios to ensure that information is able to flow correctly between applications.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Used Teradata SQL Assistant, Teradata Administrator and PMON, along with data load/export utilities like BTEQ, FastLoad, MultiLoad, FastExport and TPump, in UNIX/Windows environments, and ran the batch processes for Teradata.
  • Created dashboards on Tableau from different sources, blending data from Oracle, SQL Server, MS Access and CSV in a single instance.
  • Used the Agile Scrum methodology to build the different phases of Software development life cycle.
  • Documented logical, physical, relational and dimensional data models. Designed the data marts in dimensional data modeling using star and snowflake schemas.
  • Created dimensional model based on star schemas and designed them using Erwin.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
  • Carrying out HL7 interface unit testing aiming to confirm that HL7 messages sent or received from each application conform to the HL7 interface specification.
  • Used tools such as SAS/Access and SAS/SQL to create and extract oracle tables.
  • Performed data modeling and design of the data warehouse and data marts in star schema methodology with conformed and granular dimensions and FACT tables.
  • Developed SQL queries to fetch complex data from different tables in remote databases using joins, database links and BULK COLLECT (see the sketch after this section).
  • Enabled the SSIS package configuration to make the flexibility to pass the connection strings to connection managers and values to package variables explicitly based on environments.
  • Responsible for Implementation of HL7 to build Orders, Results, ADT, DFT interfaces for client hospitals
  • Used OBIEE to create reports.
  • Worked on data modeling and produced data mapping and data definition specification documentation.

Environment: Erwin, Oracle, SQL Server 2008, Power BI, MS Excel, Netezza, Agile, MS Visio, Rational Rose, Requisite Pro, SAS, SSIS, SSRS, Windows 7, PL/SQL, MDM, Teradata, MS Office, MS Access, SQL, Tableau.
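
A minimal PL/SQL sketch of the remote-table pattern described above, assuming a hypothetical rpt_link database link and a local stg_claims staging table whose columns match the cursor:

```sql
DECLARE
  -- Join two remote tables over a database link (names illustrative).
  CURSOR c IS
    SELECT c1.claim_id, c1.member_id, c1.claim_amt
    FROM   claims@rpt_link  c1
    JOIN   members@rpt_link m ON m.member_id = c1.member_id
    WHERE  c1.claim_dt >= ADD_MONTHS(TRUNC(SYSDATE), -1);
  TYPE t_rows IS TABLE OF c%ROWTYPE;
  l_rows t_rows;
BEGIN
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_rows LIMIT 1000;  -- fetch in 1000-row batches
    EXIT WHEN l_rows.COUNT = 0;
    FORALL i IN 1 .. l_rows.COUNT                 -- bulk-insert each batch locally
      INSERT INTO stg_claims VALUES l_rows(i);
    COMMIT;
  END LOOP;
  CLOSE c;
END;
/
```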

Confidential, Salem, NC

Sr. Data Modeler/Data Analyst/Architect

Responsibilities:

  • Designed logical and physical data models for multiple OLTP and Analytic applications.
  • Created business data constraints, indexes, sequences etc. as needed in the Physical data model.
  • Extensively used the Erwin design tool & Erwin Model Manager to create and maintain the Data Mart.
  • Extensively used Star Schema methodologies in building and designing the logical data model into Dimensional Models
  • Created stored procedures using PL/SQL and tuned the databases and backend process.
  • Involved with Data Analysis primarily Identifying Data Sets, Source Data, Source Meta Data, Data Definitions and Data Formats
  • Performance tuning of the database, which includes indexes, and optimizing SQL statements, monitoring the server.
  • Wrote SQL Queries, Dynamic-queries, sub-queries and complex joins for generating Complex Stored Procedures, Triggers, User-defined Functions, Views and Cursors.
  • Created new HL7 interface based on the requirement using XML, XSLT technology.
  • Experienced in creating UNIX scripts for file transfer and file manipulation.
  • Utilized SDLC and Agile methodologies such as SCRUM.
  • Data Stage jobs were scheduled, monitored, performance of individual stages was analyzed and multiple instances of a job were run using Data Stage Director.
  • Led successful integration of HL7 Lab Interfaces and used expertise of SQL to integrate HL7 Interfaces and carried out detailed and various test cases on newly built HL7 interface.
  • Wrote simple and advanced SQL queries and scripts to create standard and ad hoc reports for senior managers.
  • Used expert-level understanding of different databases in combination for data extraction and loading, joining data extracted from different databases and loading it into a specific database.
  • Designed and developed PL/SQL procedures, functions and packages to create summary tables (see the sketch after this section).

Environment: SQL Server, UML, Business Objects 5, Teradata, Windows XP, SSIS, SSRS, Embarcadero ER/Studio, Erwin, DB2, Informatica, HL7, Oracle, Query Management Facility (QMF), DataStage, Clear Case forms, SAS, Agile, UNIX and Shell Scripting.
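
One plausible shape for the summary-table procedures mentioned above; a sketch only, with hypothetical monthly_sales_summary and sales tables:

```sql
CREATE OR REPLACE PROCEDURE refresh_monthly_sales_summary AS
BEGIN
  -- Rebuild the summary from the detail table (full refresh for simplicity).
  EXECUTE IMMEDIATE 'TRUNCATE TABLE monthly_sales_summary';

  INSERT INTO monthly_sales_summary (sale_month, product_id, total_qty, total_amt)
  SELECT TRUNC(sale_dt, 'MM'), product_id, SUM(qty), SUM(amt)
  FROM   sales
  GROUP BY TRUNC(sale_dt, 'MM'), product_id;

  COMMIT;
END refresh_monthly_sales_summary;
/
```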

Confidential, Charlotte, NC

Data Analyst/Data Modeler

Responsibilities:

  • Developed Data Mapping, Data Governance and transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
  • Created new conceptual, logical and physical data models using Erwin and reviewed these models with application team and modeling team.
  • Performed numerous data pulling requests using SQL for analysis.
  • Created databases for OLAP Metadata catalog tables using forward engineering of models in Erwin.
  • Strengthened the development and implementation of the bank's Wealth Management technology roadmap by supporting the planning and analysis of business requirements for ongoing system changes and enhancements to the organization's CRM system. As CRM system administrator, worked in tandem with engineering, data analysts and business analysts to create and translate requirements into solution designs, participating in the system development life cycle from concept through testing, implementation and support using the Agile development methodology.
  • Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
  • Proficient in importing/exporting large amounts of data from files to Teradata and vice versa.
  • Identified and tracked the slowly changing dimensions, heterogeneous sources and determined the hierarchies in dimensions.
  • Utilized ODBC for connectivity to Teradata & MS Excel for automating reports and graphical representation of data to the Business and Operational Analysts.
  • Extracted data from existing data sources and developed and executed departmental reports for performance and response purposes using Oracle SQL and MS Excel.
  • Extracted data from existing data source and performed ad-hoc queries.
  • Used BTEQ to run Teradata SQL scripts to create the physical data model (see the BTEQ sketch after this section).

Environment: UNIX scripting, Oracle SQL Developer, SSRS, SSIS, Teradata, Windows XP, SAS data sets
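
A minimal BTEQ wrapper of the kind used to run model DDL scripts like the one noted above; the logon string and file name are placeholders:

```
.LOGON tdprod/etl_user,<password>
.RUN FILE = create_physical_model.sql

.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
```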
