
Data Modeler/Data Analyst Resume


Dearborn, MI

PROFESSIONAL SUMMARY:

  • Around 8 years of professional experience as a Data Modeler/Analyst and Business Systems Analyst across diverse industry verticals, including insurance, banking, and services.
  • Extensive experience in developing logical and physical data models for both transactional and analytical systems.
  • Proficient in Enterprise Data Warehousing; worked extensively on several projects involving both Forward Engineering and Reverse Engineering using data modeling tools.
  • Experience with Business Process Modeling, Process Flow Modeling, and Data Flow Modeling.
  • Experience in analyzing business requirements and creating Business Requirement Documents (BRDs). Expertise in interacting with stakeholders; gathering requirements through interviews, workshops, and existing system documentation or procedures; defining business processes; and identifying and analyzing risks using appropriate templates and analysis tools.
  • Proficient in requirements gathering, organizing JAD sessions, one-on-one interviews, and focus groups.
  • Well versed in Kimball and Inmon data warehouse philosophies. Experienced in normalization/denormalization techniques for optimum performance in OLTP and OLAP environments.
  • Expertise in Data Modeling using tools like ERwin, ER/Studio, and PowerDesigner.
  • Experience with Teradata database commands and utilities such as BTEQ, SQL Assistant, FastLoad, MultiLoad, and FastExport. Worked extensively on XML Schema designs.
  • Hands-on experience with ERwin for Entity-Relationship modeling, transactional database and Data Warehouse modeling, and Dimensional Data Modeling of Data Marts with Fact and Dimension tables.
  • Hands-on knowledge of enterprise repository tools, data modeling tools, data mapping tools, and data profiling tools.
  • Extensive experience writing SQL queries, along with good experience developing T-SQL and Oracle PL/SQL scripts, stored procedures, and triggers for business logic implementation.
  • Good understanding of and experience with Agile and Waterfall environments.
  • Involved in performance optimization of DWH solutions, on both the ETL and database sides.
  • Experience in preparing project cutover plans and test plans with test cases for unit testing, regression testing, and User Acceptance Testing (UAT).
  • Excellent verbal and written communication and organizational skills, combined with the ability to work in a fast-paced environment individually and/or as a team member.

TECHNICAL SKILLS:

Data Modeling Tools: ERwin 7.1/7.2/7.3, System Architect 11.3, MS Visio 2000/2007/2010, ER/Studio, PowerDesigner 11.0/12.0/12.5/15.0.

Databases & Database Tools: MS SQL Server 2000/2005/2008, Oracle 9i/10g/11g, MS Access 2003/2007/2010, Teradata, MySQL.

Reporting Tools: Cognos 8.3/8.4, Cognos Report Studio, Cognos Analysis Studio, Cognos Framework Manager, MicroStrategy.

Operating Systems: Linux, UNIX, DOS, Windows Server 2003/2008, Windows 98/NT 4.0/2000/XP/Vista.

Languages: SQL, PL/SQL, MySQL.

ETL Tools & Others: Informatica PowerCenter 8.x (Source Analyzer, Target Designer, Mapping Designer, Workflow Manager), MS Office, Rational RequisitePro, SharePoint.

PROFESSIONAL EXPERIENCE:

Confidential, Dearborn, MI

Data Modeler/Data Analyst

Responsibilities:

  • Performed Data Accuracy, Data Analysis, and Data Quality checks before and after loading the data.
  • Developed SQL joins, SQL queries, tuned SQL, views, test tables, and scripts in the development environment.
  • Used SQL*Loader to load data from external systems and developed PL/SQL APIs to move the data from staging tables into base tables (a sketch of this pattern follows this list).
  • Performed Source-to-Target application data mapping based on the business rules.
  • Performed data sanity checks and quantitative analysis on the target system.
  • Moved data from the target system to the Oracle Operational Data Store (ODS) and finally to Teradata, on which the Enterprise Data Warehouse is based.
  • Designed and developed the dataflow architecture for building scorecard and statistical reports using the Enterprise Data Warehouse.
  • Designed and developed BTEQ scripts to load data from the Operational Data Store (ODS) to the Enterprise Data Warehouse (see the BTEQ sketch after this list).
  • Monitored ETL jobs through Informatica and BTEQ logs and was effectively involved in troubleshooting issues.
  • Created the test plan and test strategy to regression test the data warehouse environment, and performed mock conversion testing to validate business scenarios by pulling reports through tools like Cognos and Brio.
  • Prepared project plans and Project Management Office (PMO) reporting.
  • Prepared developer test cases to verify the APIs written for the above-mentioned dataflow architecture.
  • Supported stress and volume testing where applicable to ensure target environments could support target volumes.
  • Coordinated shadow environment development and testing activities by working with the infrastructure team, Oracle and Teradata DBAs, Data Architects, Business Analysts, and Subject Matter Experts.
  • Managed and supported hour-by-hour plans for all mock conversions and the actual production cutover.
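
As an illustration of the staging-to-base pattern referenced above, here is a minimal sketch of a SQL*Loader control file and a PL/SQL API; all table and column names (stg_customer, base_customer, and so on) are hypothetical, not taken from the actual project.

    -- load_stg_customer.ctl: SQL*Loader control file (hypothetical names)
    LOAD DATA
    INFILE 'customer_feed.csv'
    APPEND INTO TABLE stg_customer
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (cust_id, cust_name, load_dt DATE "YYYY-MM-DD")

    -- PL/SQL API that moves new rows from staging into the base table
    CREATE OR REPLACE PROCEDURE load_base_customer AS
    BEGIN
      INSERT INTO base_customer (cust_id, cust_name, load_dt)
      SELECT s.cust_id, s.cust_name, s.load_dt
        FROM stg_customer s
       WHERE NOT EXISTS (SELECT 1
                           FROM base_customer b
                          WHERE b.cust_id = s.cust_id);
      COMMIT;
    END load_base_customer;
    /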
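
The ODS-to-EDW loads mentioned above would typically look like the following minimal BTEQ sketch, assuming hypothetical ods and edw schema and table names:

    .LOGON tdprod/etl_user,password
    -- Load the warehouse table from the ODS copy (hypothetical names)
    INSERT INTO edw.customer_dim (cust_id, cust_name, load_dt)
    SELECT cust_id, cust_name, CURRENT_DATE
    FROM   ods.customer
    WHERE  load_dt = CURRENT_DATE;
    -- Abort with a non-zero return code if the insert failed
    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0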

Environment: Oracle 11g, Teradata 14.0, Oracle SQL*Loader, Informatica 9.5.1, Informatica Data Profiler, Informatica Data Quality, Seapine TestTrack Client 2008.1, QTP 11, Data Warehouse, OLAP, SQL Navigator, SQL Developer, XML, OLTP

Confidential, Dallas, TX

Data Modeler/Analyst

Responsibilities:

  • Processed large Excel source data feeds for Global Function Allocations and loaded the CSV files into a DB2 database.
  • Validated that data in the development and production environments were in sync.
  • Documented functional specifications, conversions, upgrades, interfaces, reports, forms, and workflows.
  • Created the Hem Clinical Data dictionary for the various facilities.
  • Understood the healthcare domain as it relates to core measure analysis and reporting.
  • Developed mapping documents for the ICD-9 to ICD-10 application.
  • Participated in sessions to map ICD-9 to ICD-10 codes for each conversion.
  • Through data analysis and mapping, created specifications reflecting how every field of information was to be converted.
  • Analyzed all ICD-10 codes and validated the outcomes.
  • Participated in and/or facilitated iterative mocks, internally and externally with clients, to validate data, refine mapping requirements, and ensure accuracy and quality.
  • Analyzed and validated technical requirements for HIPAA 5010 and gathered technical requirements for ICD-10.
  • Worked with an ICD-10 Code Translator tool to ensure that all ICD-9 codes were converted into ICD-10-CM codes.
  • Attended the CMS teleconference on the ICD-10 Federal mandate for providers.
  • Created a gap analysis of the ICD-9 to ICD-10 conversion process and was also involved in data mapping for the conversion (a crosswalk sketch follows this list).
  • ICD-9 to ICD-10 Conversion: performed impact analysis to determine the systems impacted by the ICD-9 to ICD-10 conversion.
  • Reviewed process designs and stored procedures, and ensured compliance with standards.
  • Involved in Data Quality work, using Informatica Data Profiler for profiling.
  • Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Involved in testing OLAP cubes and multiple web-based applications across different dimensions, allowing users to analyze the core measures data.
  • Built tables and wrote SQL statements/queries, stored procedures, and views using company-defined best practices for security and efficiency.
  • Created relational data models in ERwin and set the enterprise modeling standards.
  • Developed the complete batch processing module by writing complex DB2 stored procedures, functions, and triggers (see the SQL PL sketch after this list).
  • Developed best practices and standards for Data Governance processes.
  • Created source-to-target mappings for multiple sources from SQL Server to DB2, which were used by the ETL developers.
  • Performed Data Accuracy, Data Analysis, and Data Quality checks before and after loading the data.
  • Developed SQL joins, SQL queries, tuned SQL, views, test tables, and scripts in the development environment.
  • Used SQL*Loader to load data from external systems and developed PL/SQL programs to move the data from staging tables into base tables.
  • Extensively wrote SQL queries, subqueries, correlated subqueries, and join conditions for Data Accuracy, Data Analysis, and Data Extraction needs (a correlated subquery example follows this list).
  • Developed E-R diagrams for the logical database model and created the physical data model with ERwin Data Modeler.
  • Performed SQL process/results evaluations and/or problem investigation.
  • Responsible for routine and/or scheduled manual database import/update processes from external sources to support comparative reports.
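
As an illustration of the code mapping work above, a GEM-style crosswalk (CMS publishes General Equivalence Mappings between ICD-9 and ICD-10) can be applied with a simple join; the table and column names here are hypothetical:

    -- Translate claim diagnosis codes via a hypothetical GEM crosswalk table
    SELECT c.claim_id,
           c.icd9_code,
           g.icd10_code,
           g.flags          -- approximate / no-map indicators from the GEM file
    FROM   claims c
    LEFT JOIN gem_icd9_to_icd10 g
           ON g.icd9_code = c.icd9_code;

A LEFT JOIN keeps codes with no ICD-10 equivalent visible, so they can be routed to manual review.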
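
A minimal sketch of the kind of DB2 SQL PL batch procedure described above, assuming hypothetical staging, target, and audit tables:

    -- Batch step: move today's staged measures and record an audit row
    CREATE PROCEDURE load_daily_measures ()
    LANGUAGE SQL
    BEGIN
      DECLARE v_rows INTEGER DEFAULT 0;

      INSERT INTO core_measures (facility_id, measure_cd, measure_dt, measure_val)
      SELECT facility_id, measure_cd, measure_dt, measure_val
      FROM   stg_core_measures
      WHERE  measure_dt = CURRENT DATE;

      GET DIAGNOSTICS v_rows = ROW_COUNT;

      INSERT INTO batch_audit (proc_name, rows_loaded, run_ts)
      VALUES ('LOAD_DAILY_MEASURES', v_rows, CURRENT TIMESTAMP);
    END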
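
And a small example of a correlated subquery used for a data accuracy check, again with hypothetical tables:

    -- Flag claims whose billed amount disagrees with the latest adjudication
    SELECT c.claim_id, c.billed_amt
    FROM   claims c
    WHERE  c.billed_amt <>
           (SELECT MAX(a.adjudicated_amt)
            FROM   adjudications a
            WHERE  a.claim_id = c.claim_id);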

Environment: DB2 on z/OS, ERwin Data Modeler r7.3, ERwin Model Navigator r7.3, Informatica Data Profiler, Informatica Data Quality, Seapine TestTrack Client 2008.1, QTP 11, Data Warehouse, OLAP, SQL Navigator, SQL Developer, ERwin 4.0, XML, OLTP

Confidential

Data Modeler/Data Analyst

Responsibilities:

  • Involved in gathering business requirements by conducting a series of meetings and brainstorming sessions with the business users.
  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
  • Involved in identifying the process flows, workflows, and data flows of the core systems.
  • Worked extensively on user requirements gathering and gap analysis.
  • Created Conceptual and Logical models and the Physical Database design for OLTP and OLAP systems.
  • Redefined many attributes and relationships in the reverse-engineered model and cleansed unwanted tables/columns as part of data analysis responsibilities.
  • Monitored the data quality of the daily processes and ensured the integrity of the data was maintained for effective functioning of the department.
  • Conducted logical data model walkthroughs and validation.
  • Worked with DBAs to create a best-fit physical data model from the logical data model.
  • Implemented referential integrity using primary key and foreign key relationships.
  • Identified objects and relationships and how they fit together as logical entities; these were then translated into the physical design using ER/Studio's reverse engineering capability.
  • Extensively used Star Schema methodologies in building the logical data model and designing it into Dimensional Models.
  • Created and designed Fact and Dimension tables and Conceptual, Logical, and Physical Data Models using ER/Studio (see the star schema sketch after this list).
  • Involved in designing the integrated Data Warehouse for the company to store the data from the OLTP systems.
  • Involved in implementing and testing the developed application data standards.
  • Involved in data analysis for data conversion, including data mapping from source to target database schemas and the specification and writing of data extraction scripts/programs for data conversion, in test and production environments.
  • Data Warehouse: designed and programmed ETL and aggregation of data in the target database, working with staging, de-normalized, and star schemas and dimensional reporting.
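
A minimal star schema sketch of the kind referenced above, with primary key/foreign key referential integrity; the tables here (a hypothetical policy data mart) are illustrative, not the actual model:

    -- Dimension tables
    CREATE TABLE dim_date (
        date_key     INTEGER       NOT NULL PRIMARY KEY,
        calendar_dt  DATE          NOT NULL
    );

    CREATE TABLE dim_policy (
        policy_key   INTEGER       NOT NULL PRIMARY KEY,
        policy_no    VARCHAR(20)   NOT NULL,
        product_cd   VARCHAR(10)
    );

    -- Fact table keyed by its dimensions, with FKs enforcing integrity
    CREATE TABLE fact_premium (
        date_key     INTEGER       NOT NULL REFERENCES dim_date (date_key),
        policy_key   INTEGER       NOT NULL REFERENCES dim_policy (policy_key),
        premium_amt  DECIMAL(12,2) NOT NULL,
        PRIMARY KEY (date_key, policy_key)
    );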

Environment: SQL Server 2000, Oracle 10g, ER/Studio, PL/SQL, UNIX, Windows NT.

Confidential

Sr. Data Analyst

Responsibilities:

  • Participated in all phases of the programmatic data repository development life cycle, with emphasis on design, development/programming, documentation, testing, and implementation.
  • Identified data anomalies for the identified data sources and reviewed results with users and data stewards.
  • Created source-to-target mappings with transformation rules.
  • Wrote SQL queries for Data Accuracy, Data Analysis, and Data Extraction needs.
  • Created requirements for data modelers, such as minimum and maximum values, acceptable ranges, and null acceptance (a profiling query of this kind follows this list).
  • Designed and prepared the Logical/Physical data models for the Global Table Portal using CA ERwin Data Modeler, per the special needs of the SSA Agency.
  • Provided support for automated and manual procedures for metadata creation and retrieval from the Enterprise Data Repository, delivered to the application teams.
  • Designed and prepared the Physical data models using CA ERwin Data Modeler to support the relational database systems, and customized the repository meta models per the special needs of the SSA Agency.
  • Provided support to streamline application development, the data dictionary, data maps, and data artifacts, and assisted with coordinating data standards across the development teams.
  • Designed and developed the backup/recovery strategy for the metadata repository.
  • Responsible for ETL design: identifying the source systems, designing source-to-target relationships, data cleansing, data quality, creating source specifications, and ETL design documents, with ETL development following Velocity best practices.
  • Analyzed the use cases, developed the test cases, and helped in the preparation of the master test plan.
  • Analyzed the current data movement ETL processes and procedures; identified and assessed external data sources as well as internal and external data interfaces.
  • Provided support for the development of data requirements for the Medicare and disability applications and worked with the latest release of the enclosed programmatic systems: Medicare Part A/B and IRMAA PT2T18.
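
The modeling requirements above are usually derived from profiling queries like this minimal sketch (the claims table and benefit_amt column are hypothetical):

    -- Profile a column to derive min/max, range, and null-acceptance rules
    SELECT COUNT(*)                      AS row_cnt,
           COUNT(benefit_amt)            AS non_null_cnt,
           COUNT(*) - COUNT(benefit_amt) AS null_cnt,
           MIN(benefit_amt)              AS min_val,
           MAX(benefit_amt)              AS max_val,
           COUNT(DISTINCT benefit_amt)   AS distinct_cnt
    FROM   claims;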

Environment: ERwin Data Modeler r7.3, ERwin Model Navigator r7.3, CA Unicenter Platinum tools, UNIX Shell, CA AllFusion Metadata Repository, COBOL, JCL, DB2 on z/OS, ENDEVOR, CONTROL-M, utilities such as SPUFI and QMF, SQL, HP ALM 11.0, QTP 10.0, SQL Server 2008, Sybase 11, Oracle 11g, Windows 7/XP/2003.

Confidential

Data Analyst

Responsibilities:

  • Involved in designing the physical and logical data models using the ERwin data modeling tool.
  • Designed the relational data model for the operational data store and staging areas; designed Dimension and Fact tables for the data marts.
  • Extensively used ERwin Data Modeler for Logical/Physical Data Models and relational database design.
  • Created Stored Procedures, Database Triggers, Functions, and Packages to manipulate the database and apply the business logic according to the users' specifications.
  • Created Triggers, Views, Synonyms, and Roles to maintain the integrity plan and database security (see the trigger sketch after this list).
  • Created database links to connect to other servers and access the required information.
  • Planned and created integrity constraints, database triggers, and indexes to maintain data integrity and facilitate better performance.
  • Used Oracle Advanced Queuing for exchanging messages and communicating between different modules.
  • Performed system analysis and design for enhancements, and tested Forms, Reports, and user interaction.
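
A minimal Oracle sketch of the integrity work described above; the orders/customers tables and last_updated column are hypothetical:

    -- Foreign key enforcing referential integrity between two tables
    ALTER TABLE orders
      ADD CONSTRAINT fk_orders_customer
      FOREIGN KEY (cust_id) REFERENCES customers (cust_id);

    -- Trigger stamping each row change, supporting the integrity plan
    CREATE OR REPLACE TRIGGER trg_orders_stamp
    BEFORE INSERT OR UPDATE ON orders
    FOR EACH ROW
    BEGIN
      :NEW.last_updated := SYSDATE;
    END;
    /
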
Environment: Oracle 9i, SQL*Plus, PL/SQL, ERwin, TOAD, Stored Procedures.
