
Data Architect Resume


Columbus, OH

SUMMARY

  • Around 10 years of extensive experience in the complete Software Development Life Cycle (SDLC) covering Requirements Management, Data Analysis, Data Modeling, System Analysis, Architecture and Design, Development, Testing and Deployment of business applications
  • Strong Data Modeling experience using ER diagrams, Dimensional/Hierarchical data modeling, Star Schema modeling, and Snowflake modeling using tools like Erwin and ER/Studio
  • Strong Informatica ETL mapping design skills and ETL development using Informatica PowerCenter
  • Created DDL scripts for implementing Data Modeling changes; created Erwin reports in HTML and RTF formats depending upon the requirement; published data models in the model mart; created naming convention files; and coordinated with DBAs to apply the data model changes
  • Extensive experience in Data Conversion in ACH Transactions
  • Extensive experience in gathering business requirements, implementing business processes, identifying risks, impact analysis, UML modeling, sequence and activity diagrams and using Rational Rose
  • Utilized RUP (Rational Unified Process) to configure and develop processes, standards, and procedures
  • Possess strong Conceptual and Logical Data Modeling skills; experienced with JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications and queries
  • Created, maintained, and modified database design documents with detailed descriptions of logical entities and physical tables
  • Strong grasp of Third Normal Form (3NF) data modeling
  • Implemented Data Warehousing projects with very strong expertise in Dimensional Data Modeling
  • Strong knowledge of data modeling patterns and industry standard representations
  • Extensive experience with Oracle OBIEE business intelligence solutions.
  • Excellent knowledge of waterfall, spiral and Agile methodologies of Software Development Life Cycle (SDLC)
  • Expertise in Software Development Life Cycle Process (SDLC), Use Cases and Rational Unified Process (RUP)
  • Thorough understanding of Big Data concepts and NoSQL databases
  • Experience working in fast paced Agile environments
  • Well versed with SQL Server, Netezza, Teradata, DB2, Oracle and NoSQL databases
  • Possess strong documentation skill and knowledge sharing among team, conducted data modeling review sessions for different user groups, participated in requirement sessions to identify requirement feasibility
  • Extensive Experience working with business users/SMEs as well as senior management
  • Experience in coordinating with offshore on development / maintenance projects
  • Strong understanding of the principles of Data warehousing, Fact Tables, Dimension Tables, star and snowflake schema modeling
  • Experience in backend programming including schema and table design, stored procedures, Triggers, Views, and Indexes
  • Possess strong analytical, verbal, and interpersonal skills that help in communicating with developers and team members
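The star and snowflake schema modeling mentioned above follows a simple pattern: dimension tables hold descriptive attributes, and the fact table holds measures plus foreign keys into each dimension. A minimal sketch (with hypothetical table and column names, using SQLite purely for illustration):

```python
import sqlite3

# Hypothetical star schema: two dimensions plus one fact table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables carry the descriptive attributes.
cur.execute("""CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,
    full_date  TEXT)""")
cur.execute("""CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT)""")

# The fact table stores measures and foreign keys into each dimension.
cur.execute("""CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    sale_amount REAL)""")

cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
cur.execute("INSERT INTO fact_sales VALUES (20240101, 1, 99.5)")

# A typical star join: slice the fact by a dimension attribute.
row = cur.execute("""
    SELECT p.product_name, SUM(f.sale_amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.product_name""").fetchone()
print(row)  # ('Widget', 99.5)
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables.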

TECHNICAL SKILLS

Data Modeling Tools: IBM InfoSphere Data Architect 7.6.0, ER/Studio 2016/9.7/9.5/8.5, Erwin r9.7/7.5.8/4.1.4, Power Designer 12.1, Rational Rose, and Visio 2013

Database Systems: Teradata 12, Oracle (11g/10g/9i/8i/7.x), SQL Server (2014/2008/2005/2000/7.0), Microsoft Access 2016/2010/2007, DB2, MySQL, Oracle Primavera Unifier, OAKS CI, Cloud, ArcGIS Desktop 10, Hadoop, MongoDB, Cassandra

Data Warehousing: Informatica PowerCenter 9.1/8.1/8.0/7.1/7.0/6.2/6.1/5.2, Informatica PowerMart 4.7, PowerConnect, PowerExchange, Data Profiling, Data Cleansing, Netezza, SAP HANA, OLAP, OLTP, SQL*Plus, SQL Developer, Composite Software 6.2.0, Toad for Oracle Xpert 12.0.0.61, IBM Data Studio, OBIEE, SAP BO, BizTalk, Tableau 10, Web Services, Java, JDK, IntelliJ IDEA, Java Angular JSP, QUARTZ Job Scheduler, VersionOne Enterprise

Software Engineering: UML using Visio

Office Applications: MS Office Professional Plus 2016/2013/2010/2007 (Word, Excel, PowerPoint)

Configuration Management: ClearCase, Visual SourceSafe, CVS (Concurrent Versions System), Remedy

Operating Systems: Windows 95/98/2000/XP, Solaris, Linux, AIX

Quality Assurance: Business and Software Process and Procedure Definition, Quality Models and standards (ISO CMM, CMMI, TQM Principles, Six Sigma concepts), Quality tools (Ishikawa diagram, Pareto analysis, histogram Process), Measures and Metrics, Project Reviews, Audits and Assessments

Testing: QTP, Quality Center

PROFESSIONAL EXPERIENCE

Confidential, Columbus, OH

Data Architect

Responsibilities:

  • Worked on the Data Architecture framework and provided Data Flow Diagrams
  • Created Conceptual, Logical and Physical Data models to support Data mart using Erwin r9.7 CASE tool
  • Worked in an Agile environment and conducted working sessions to review the Logical and Physical Data models to support all the Business and Data requirements
  • Created source to target mapping document and captured complete Data lineage
  • Created, maintained and delivered change log document to track all the weekly database deployment changes
  • Analyzed and documented source systems like Cloud based OAKS CI and NoSQL database MongoDB
  • Conducted weekly Project Manager’s group meeting to discuss and improve the best practices and Data standards
  • Involved in MDR (Meta Data Repository) development initiative to capture all the data lineage
  • Enforced Confidential Data Model best practices and Data standards across enterprise
  • Designed structures specifically to enable fast querying for business-centric reporting
  • Ensured that business requirements are met, and reports are accurate and meaningful
  • Documented source and target systems correctly to aid development, ensure effective version control, and enhance understanding of the systems
  • Delivered detailed High-Level Design (HLD) and Low-Level Design (LLD) documents
  • Delivered Conceptual, Logical, and Physical data models to support Data mart
  • Used Dimensional modeling techniques and best practices and created a Star schema structure to support all Confidential reporting needs through interactive data visualization tools like Tableau
  • Generated DDL and deployed in Dev and Test environments
  • Created, maintained, documented, and archived all enterprise data models, and created robust, error-free model artifacts (DDL, DTD, XML, and Data Dictionary) as required to support the deployment of data structures
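A source-to-target mapping document with lineage, as described above, boils down to per-column records from which a lineage view can be derived. The sketch below uses hypothetical table and column names purely for illustration:

```python
from dataclasses import dataclass

# One row of a hypothetical source-to-target (S2T) mapping document.
@dataclass
class ColumnMapping:
    source_table: str
    source_column: str
    target_table: str
    target_column: str
    transformation: str = "direct move"

mappings = [
    ColumnMapping("SRC_CUSTOMER", "CUST_NM", "DIM_CUSTOMER", "CUSTOMER_NAME"),
    ColumnMapping("SRC_ORDER", "ORD_AMT", "FACT_ORDER", "ORDER_AMOUNT",
                  "ROUND(ORD_AMT, 2)"),
]

# Lineage view: for any target column, trace back to its source column.
lineage = {f"{m.target_table}.{m.target_column}":
           f"{m.source_table}.{m.source_column}" for m in mappings}
print(lineage["FACT_ORDER.ORDER_AMOUNT"])  # SRC_ORDER.ORD_AMT
```

Keeping the mapping in a structured form like this makes it easy to generate both the ETL specification and the lineage documentation from a single artifact.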

Environment: MySQL, MongoDB, Oracle Primavera Unifier, OAKS CI, Cloud, Access 2016, Erwin r9.7, DbVisualizer 9.5.6, Tableau 10, Web Services, Java, JDK, IntelliJ IDEA, Java Angular JSP, QUARTZ Job Scheduler, VersionOne Enterprise, Windows 7 Enterprise, XML, Microsoft Office 2016, Excel, Access, Visio 2013, Snipping Tool, Notepad++, PDF Creator, Adobe Acrobat XI Pro, Skype for Business 2016.

Confidential, Hilliard, OH

Data Architect/Information Architect

Responsibilities:

  • Worked on the Data Architecture framework for RITA CMS MX Booking, deal info structures, Payments, Servicing, and End of Term modules
  • Reverse engineered existing data structures and analyzed the source systems like EDB, MX and BR Originations
  • Created Conceptual, Logical and Physical Data models to support Contract Management System (RITA CMS) using ER/Studio 2016 CASE tool
  • Worked in an Agile environment and conducted working sessions to review the Logical and Physical Data models to support all the Business and Data requirements
  • Developed and executed data population scripts for Deal Info structure tables like product, product property and contract product map etc. to support Dev, Test environments
  • Created source to target mapping document including contract and product
  • Created, maintained and delivered change log document to track all the weekly database deployment changes
  • Thorough understanding of Big Data concepts and NoSQL databases
  • Conducted weekly Data Architect’s group meeting to discuss and improve the best practices and Data standards
  • Involved in MDR (Meta Data Repository) development initiative to capture all the data lineage
  • Enforced Confidential Data Model best practices and Data standards across enterprise
  • Developed and executed the macro ‘Confidential Physical Names’
  • Delivered detailed High-Level Design (HLD) and Low-Level Design (LLD) documents
  • Delivered Conceptual, Logical, and Physical data models.
  • Generated DDL and deployed in Dev and Test environments
  • Checked in DDL scripts in to TFS (Team Foundation Server) as part of the deployment process
  • Created, maintained, documented, and archived all enterprise data models, and created robust, error-free model artifacts (DDL, DTD, XML, and Data Dictionary) as required to support the deployment of data structures

Environment: SQL Server 2014 Management Studio, Microsoft SQL Server 2008 R2, ER/Studio 2016/9.7, SQL Developer, Microsoft Visual Studio 2013, Toad for Oracle 12, Windows 7 Enterprise, XML, Excel, Access, Visio, Request Center (RC) BMC, .Net, Java, JDK, TFS Explorer, Informatica 9.1.0, SSIS, SSRS, OBIEE, QlikView, Siebel, SAP, BizTalk, Snipping Tool, Notepad++, PDF Creator, Adobe.

Confidential, Orlando, FL

Data Architect/ Data Modeler

Responsibilities:

  • Worked on Inventory Setup, Inventory Distribution and Inventory Offering Assignment work packages
  • Created Conceptual, Logical and Physical Data models to support View and Manage Availability (VMA) work packages using ER/Studio 9.7 CASE tool
  • Analyzed the reporting requirements for Inventory Offering Mapping, Allocation Report, Unassigned Counts and Total Number of points in Repository
  • Analyzed source systems (SOR) Physical Inventory (SQL Server), VStage, CR, Property Interface and Allocation Inventory (AI)
  • Designed reporting database to provide operational reporting capability
  • Involved in analysis and designing of Product Hub Master Data management (MDM) database
  • Added incremental functionality to the existing system
  • Generated DDL and closely worked with DBA team to deploy Data models
  • Referred WVO Data Model Patterns Document to meet WVO best practices and Data standards.
  • Developed and used a Token list to translate names from Logical to Physical models; the Token list provides the acceptable abbreviation for any friendly name. Followed the Tokenization and Lineage Tagging Standards process by executing the macro ‘WVO Tokenize Physical Names’
  • Delivered detailed High-Level Design (HLD) and Low-Level Design (LLD) documents
  • Created, maintained, documented, and archived all enterprise data models, and created robust, error-free model artifacts (DDL, DTD, XML, and CWM) as required to support the deployment of data structures
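The token-list translation from logical friendly names to physical names can be sketched in a few lines. The abbreviation table below is a hypothetical stand-in; the actual WVO token list and macro are proprietary:

```python
# Hypothetical token list: approved abbreviations for name fragments.
TOKENS = {"customer": "CUST", "account": "ACCT", "number": "NBR",
          "date": "DT", "amount": "AMT"}

def tokenize_physical_name(logical_name: str) -> str:
    """Translate a logical friendly name into a physical column name,
    abbreviating each word via the token list where an entry exists."""
    words = logical_name.lower().split()
    return "_".join(TOKENS.get(w, w.upper()) for w in words)

print(tokenize_physical_name("Customer Account Number"))  # CUST_ACCT_NBR
print(tokenize_physical_name("Order Status"))             # ORDER_STATUS
```

Centralizing abbreviations this way keeps physical names consistent across models and makes lineage tagging mechanical rather than ad hoc.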

Environment: Oracle 11i, DB2, Mainframes, SQL Server, ER/Studio 9.7, Oracle SQL Developer, Windows 7, XML, Excel, Access, Visio, Toad for Oracle 12, Request Center (RC) Assyst, .Net, Java, JDK, Toad Data Point 3.3, Oracle Express 11g, Oracle Client 11g, TFS Explorer, Microsoft Visual Studio 2013, Informatica 9.1.0, OBIEE, Snipping Tool, IBM Data Studio 2.2, WinSCP 4.3.2, Blueprint 6.1, MDM, Jenkins, Notepad++, PDF Creator, Adobe, APEX

Confidential, Jacksonville, FL

Data Architect

Responsibilities:

  • Enhanced and leveraged Confidential 's investment in corporate data resources by providing data analysis support, modeling support and coordination of corporate data resources
  • Implemented and enforced data definitions, standards and procedures relating to Logical and Physical design
  • Administered data design efforts and assisted software engineers in data source discovery, data modeling, and metadata capture and normalization techniques
  • Worked on the Service Requests submitted by the Application teams
  • Managed the incoming Service Requests and maintenance workload using the Service Management and Request Tool
  • Incrementally developed conceptual, logical and physical data models for new database objects using ER/Studio 9.7 CASE tool
  • Maintained existing database objects, from software engineer provided requirements or data definition language (DDL), using the ER/Studio Computer Aided Software Engineering (CASE) tool
  • Led database design sessions and captured data requirements for software engineering and re-engineering projects
  • Collaborated with database administrators on storage requirements, performance strategy, primary and foreign key constraints, placement of tables within table spaces, indexing strategy, and DDL implementation
  • Ensured the integrity of all Confidential production databases through a structured change control process
  • Advocated for Data Management principles and capabilities within the organization
  • Educated application teams to enforce Confidential data standards by providing Confidential Data standard documents and related awareness through email communications and pair programming
  • Provided a better understanding of the Confidential standard SR workflow to achieve quality results
  • Involved in design flow of testing and new enhancements of SR application process and recommended new guidelines in the process flow
  • Implemented new features in the SR application process for future releases
  • Extended help to resolve Emergency Production SRs during and after hours to meet application teams’ timelines
  • Completed 95% of SRs within the SLA time frame; 5% were deferred due to lack of clear instructions or missing required information from the Requester, which leads to rework. Nevertheless, decreased the percentage of ‘Returned’ SRs after communicating with the Requester about the incorrect data provided
  • Promoted ‘Demand’ process to application teams by providing clear differences between SR and Demand services
  • Provided bi-weekly SR status reports to management to analyze and make better decisions to improve the services
  • Received appreciation for lasting contributions, especially on Emergency SRs (‘Best service and a friendly DA’), from various requesters
  • Miscellaneous activities and responsibilities as assigned by manager

Environment: Oracle 11i, DB2, Mainframes, SQL Server, ER/Studio 9.7/9.5, Windows 7, XML, Excel, Access, Visio, Toad for Oracle 12, Request Center (RC), .Net, Java, Toad Data Point 3.3, Snipping Tool, IBM Data Studio 2.2, ArcGIS Desktop 10, WinSCP 4.3.2, Notepad++, PDF Creator, Adobe, APEX

Confidential, Houston, TX

Data Architect

Responsibilities:

  • Worked on multiple data domains and ensured consistency in the maintenance and usage of the data
  • Worked on Data Architecture plans and front-end loaded development activities like requirements and high-level design
  • Maintained an end-to-end vision, translating the logical design into one or more physical databases and defining how the data flows through the successive stages involved
  • Designed Project Data Architecture approach and strategy (ETL, BI, etc.)
  • Ensured alignment with CNAP and MCBU organization’s standards
  • Addressed issues of Data Migration, Validation, clean-up and mapping
  • Gathered and Analyzed Data Requirements by consulting Stake holders and Business Users
  • Created Data Flow Diagrams to understand the relationships and interrelationships of systems and subsystems, modules, components
  • Analyzed current state Architecture and created target state Architecture
  • Performed Data Profiling, Gap Analysis (SOR including hierarchy sync) in Composite stored procedures and Views
  • Created centralized artifacts for Organization and Project
  • Analyzed Source System Data, and Created and maintained up-to-date Source to Target mapping document with clear transformation rules
  • Validated and reconciled Source to target data discrepancies
  • Created technical ETL design documents
  • Created the Data model (ERD) for the project and maintained an integrated CIM model
  • Leveraged Upstream Architecture when designing specific projects and solutions

Environment: SQL Server, Oracle, Mainframes, COBOL, DB2, VSAM, Microsoft SQL Server 2008 R2, SAP, .Net, ERwin 7.3.8, DataStage, Informatica PowerCenter 8.1, Toad for Oracle 11.5, Toad for Data Analysts 3.1, Composite Software 6.2.0, Global Information Link, Unified Project Architecture Process (UPAP), Confidential Project Development & Execution Process (CPDEP), Windows 7 Enterprise, XML, Microsoft Lync 2010, Microsoft Office 2010, OBIEE, Microsoft SharePoint Workspace 2010, Access, Snipping Tool, Visio, Wellnomics WorkPace 4.2.3

Confidential, Columbus, OH

Data Modeler

Responsibilities:

  • Involved in Source code scanning and analysis
  • Derived logical data model from source systems
  • Obtained DDL of Production Data Stores
  • Mapped physical to logical definitions
  • Worked on Extended Transactions History (ETH), Account Transactions History (ATH) and Real Time Account Maintenance (RTAM) domains
  • Created logical data model with physical mapping for ATH, ETH and RTAM subject areas
  • Standardized naming conventions to better reflect business needs
  • Attended review sessions for the Logical Data Model review (Account Processing), the combined Function, Data & Service Modeling domain review, and data modeling for CDS Transactions
  • Worked on gap analysis, comparing CDS Transactions columns to the existing IBM IFW Industry Data Model
  • Prepared User Guide for IDA data model tool
  • Created the workspace in Rational Team Concert (RTC) and worked on Check in, check out & delivering, accepting the changes
  • Worked on BOM level in RTC for ETH: Merged the ETH and ATH Logical model and generated the Physical Data Model
  • Worked on Business Objective Model (BOM-IBM IFW Industry model) level for ATH: Created new classes, added attributes to the entities and customized the new fields
  • Worked at the PDM level for ATH: reverse engineered the DDL and created the PDM
  • Added abbreviations, relationships to the entities and delivered in to Rational Team Concert (RTC)
  • Analyzed the Interface Data Mapping document for Account Open / Maintenance / Inquiry - DDA and mapped the fields from the Legacy systems to the Target Logical (BOM) and created the consolidated Mapping document for DDA
  • Worked on the common fields in both Account DDA and CDS and Mapped to the target data base
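Reverse engineering DDL into a physical data model, as done for the ATH PDM above, amounts to executing the DDL and reading the catalog back out. The sketch below uses SQLite as a stand-in for DB2, with hypothetical table and column names:

```python
import sqlite3

# Hypothetical DDL for an account-transaction-history table.
ddl = """
CREATE TABLE ACCT_TXN_HIST (
    ACCT_ID   INTEGER NOT NULL,
    TXN_DT    TEXT    NOT NULL,
    TXN_AMT   REAL
)"""

conn = sqlite3.connect(":memory:")
conn.execute(ddl)

# PRAGMA table_info returns (cid, name, type, notnull, default, pk) rows;
# reading it back recovers the physical model from the deployed DDL.
columns = [(name, ctype, bool(notnull))
           for _, name, ctype, notnull, _, _ in
           conn.execute("PRAGMA table_info(ACCT_TXN_HIST)")]
print(columns)
# [('ACCT_ID', 'INTEGER', True), ('TXN_DT', 'TEXT', True), ('TXN_AMT', 'REAL', False)]
```

CASE tools like IDA and ER/Studio automate the same round trip: ingest DDL, materialize a PDM, and let the modeler attach abbreviations and relationships on top.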

Environment: Mainframes, COBOL, DB2, VSAM, Oracle, Java, IBM InfoSphere Data Architect 7.6.0, Rational Team Concert, Informatica PowerCenter 8.1, Netezza, OBIEE, Windows XP, XML, Excel, Access, Visio

Confidential, Dallas, TX

Data Modeler

Responsibilities:

  • Analyzed the DW project database requirements from the users in terms of the tables which will be most useful
  • Analyzed the requirements for developing Conceptual model
  • Defined relationships, cardinalities among entities
  • Created and maintained Database Objects (Tables, Views, Indexes, Sequences, Database triggers, stored procedures etc.)
  • Analyzed existing Logical Data Model (LDM) and made appropriate changes to make it compatible with business logic
  • Involved in data model reviews with internal data architect, business analysts, and business users with in depth explanation of the data model to make sure it is in line with business requirements
  • Worked on analyzing source systems and their connectivity
  • Worked with cross-functional teams and prepared detailed design documents for production phase of current customer database application

Environment: SQL Server 2000/2005, Teradata 12, Oracle 9i, DB2, Mainframes, Informatica PowerCenter 8.1, Erwin 7.5.8, Windows 7, XML, Excel, Access, Visio

Confidential, Phoenix, AZ

Data Architect/Data Modeler

Responsibilities:

  • Updated and captured latest Metadata and Reconciled Existing Data Dictionaries for Organization, Integrity Checks and Relationship Accuracy
  • Managed Data Content by researching existing Data Flow Diagrams, Data Stores, Data Models, Data Qualification and Standardization rules, and Systems Documentation
  • Developed Confidential Master lists and Data Dictionaries of Pharmacy (Rx), 837 Institutional (Hx) & 837 Professional (Mx) Business Elements with tightly coupled derivative fields and grouped them logically within the Standard Data Context within the organization
  • Reverse-Engineered and Linked the Business Element Names from the various Master lists back to the (expected) Physical Instance Names in multiple Data Stores
  • Captured and highlighted key elements for Core Business Functions such as Practitioner Matching, De-Duplication, and Qualification; referenced Data Quality Constraints and rules for key elements wherever appropriate
  • Interviewed Technical Staff with regard to the Data Content, Data Standards, Data Process, Data Services and Data Rules
  • Researched existing data models, data structures, table content (using Toad) excel mapping files, and various data sources for dimensional and transaction data to build the metadata mapping in E/R Studio
  • Utilized E/R Studio to create comprehensive mapping documents and data lineage
  • Exported new or existing Data Mappings and Database Schema changes from Logical Data Models (LDMs) in E/R Studio into Physical Data Models (PDMs) and worked closely with fellow DAs and DBAs to translate and implement these models
  • Conducted Structured Review sessions to reconcile model updates and related work with Senior Data Architects and Data Governance/Data Architecture Manager
  • Organized and Published Detailed Data Content on SharePoint Knowledge Base following Peer-Review, Consensus and Sign-off
  • Published the mapping from E/R Studio in a form for non-technical viewing such as in JPEG, PDF formats

Environment: Mainframes, COBOL, DB2, VSAM, Oracle Exadata, ER/Studio 8.5, Windows XP, XML, Excel, Access, Visio

Confidential, Richmond, VA

Data Modeler/Data Analyst

Responsibilities:

  • Designed, developed and maintained a web-based platform for the E-Rebate System, including modules such as Rebate Contract Management, Data Management, Rebate Processing, Rebate Post Processing, Rebate Reconciliation, Rebate Reporting and Contract Modeling
  • Rebate Contract Management is the module that manages the rebate contracts dealing with the pharmaceutical manufacturer as well as the client for sharing purposes. These modules are essential in determining how much to bill and when, as well as how much to share with the client. The information housed in this module has a direct correlation to the efficiency of the other modules
  • Rebate Contract Management includes components such as Client Global, Manufacturer Global, Client Contract and Manufacturer Contract
  • The Global level module lists all the client and manufacturer details
  • A Client Contract is a legal arrangement between the PBM and the client for processing the claims, pharmacy benefit design as well as rates
  • A Manufacturer Contract is a legal arrangement between a pharmacy (manufacturer) company and a PBM, health plan, state or other entity where rebate reimbursement arrangements are defined for a drug or multiple drugs
  • Participated in JAD session with business users and sponsors to understand and document the business requirements in alignment to the financial goals of the company
  • Created the conceptual model for the data warehouse using Erwin data modeling tool
  • Reviewed the conceptual EDW (Enterprise Data Warehouse) data model with business users, App Dev and Information architects to make sure all the requirements are fully covered
  • Analyzed large number of COBOL copybooks from multiple mainframe sources (16) to understand existing constraints, relationships and business rules from the legacy data
  • Worked on rationalizing the requirements across multiple product lines
  • Reviewed and implemented the naming standards for the entities, attributes, alternate keys, and primary keys for the logical model
  • Implemented Agile Methodology for building an internal application
  • Reviewed the logical model with application developers, ETL Team, DBAs and testing team to provide information about the data model and business requirements
  • Worked with ETL to create source to target mappings (S2T)
  • Worked with DBA to create the physical model and tables
  • Worked on Mercury Quality Center to track the defect logged against the logical and physical model
  • Held brainstorming sessions with application developers and DBAs to discuss various denormalization, partitioning and indexing schemes for the physical model
  • Worked on Requirements Traceability Matrix to trace the business requirements back to logical model

Environment: Erwin, Quest Central for DB2 v4.8, COBOL copybooks, Mainframe DB2, SQL Server 2000, Oracle 11g, SQL*Loader, Mercury Quality Center 9, Informatica PowerCenter 8.1

Confidential, West New York, NJ

Data Modeler

Responsibilities:

  • Involved in Source code scanning and analysis
  • Derived logical data model from source systems
  • Obtained DDL of Production Data Stores
  • Mapped physical to logical definitions
  • Created logical data model with physical mapping for Financial Instrument subject area
  • Abstracted the EDM from the physical data model
  • Standardized naming conventions to better reflect business needs, e.g. Address, etc.
  • Assigned primary keys to all tables; evaluated use of multiple dates in the primary keys
  • Evaluated use of ‘Blank’ and ‘Null’ as valid attribute values in entity definitions
  • Eliminated repetitive attributes (e.g. numbered codes in tables should go to a separate table), levels and type codes, (e.g. simplify product table to make it more manageable through normalization; eliminate redundancies across databases, price and classification)
  • Reduced total number of tables (e.g. avoid single attribute tables, many Issue tables)
  • Developed logical models for: Transaction (events affecting client holdings or accounts); Client Account (holding vehicle for assets); Products & Services (firm service or offering, e.g. check writing, debit cards, etc.); Party (individual or organization of interest to Confidential WMA); Position (the intersection of a client account with a financial instrument); Financial Instrument (reference data & pricing for WMA instruments); Corporate Action (any event that brings material change to a financial instrument); Operational Position Keeping (real-time positions and balances); General Ledger (accounting books of business for Confidential WMA)
  • Defined EDM for Subject Areas such as Financial Instrument, Party, Client/Account and Operational Position Keeping
  • Defined mappings from EDM back to physical for the following subject areas:
  • Financial Instrument to PDB, FPM, KAR
  • Party to CRDB
  • Client/Account to CRDB
  • Operational Position Keeping to RPB
  • Exported EDM to Virtualization-friendly format
  • Provided a consistent, structured view of the data for front-end applications
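Eliminating repetitive numbered attributes, as in the normalization work above, means moving them into a child table with one row per value. The record layout below is a hypothetical illustration of that transformation:

```python
# Hypothetical denormalized source: repeated phone1..phone3 columns.
wide_rows = [
    {"party_id": 1, "name": "Acme", "phone1": "555-0100",
     "phone2": "555-0101", "phone3": None},
]

# Normalize: one parent row per party, one child row per non-null phone.
party, party_phone = [], []
for row in wide_rows:
    party.append({"party_id": row["party_id"], "name": row["name"]})
    for i in (1, 2, 3):
        value = row[f"phone{i}"]
        if value is not None:
            party_phone.append({"party_id": row["party_id"],
                                "seq": i, "phone": value})

print(len(party_phone))  # 2
```

The child table removes the artificial three-phone limit and the NULL padding, at the cost of a join when the phones are needed, which is exactly the trade-off normalization makes.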

Environment: Mainframes, COBOL, DB2, VSAM, Oracle, Informatica PowerCenter 8.1, OBIEE, Erwin 7.5.3, Windows XP, XML, Excel, Access, Visio.

Confidential, St. Louis, MO

Relational Data modeler

Responsibilities:

  • Gathered and translated business requirements
  • Actively participated in JAD sessions with the subject matter experts, stakeholders and other management team members to finalize the User Requirement Documentation
  • Identified the required entities and the relationships between them to create the Conceptual/Logical model
  • Efficiently developed the data model and ERD diagrams using PowerDesigner
  • Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Validated and updated the appropriate LDMs against process mappings, screen designs, use cases, the business object model and the system object model as they evolved and changed
  • Created the XML data model for the Express PA project's Rules engine and provided the model in .XSD format for the services team
  • Created the mapping documents to find out the gaps between the mockup screens, services and database
  • Identified gaps and filled them in the database by communicating with the DBA
  • Enhanced the existing data model through reverse engineering, adding new tables based on the business requirements to support the web services
  • Identified the required fields for the Drug and Coverage check web services in the database and analyzed the initial data loading process
  • Created the logical and physical database models and schema in Oracle
  • Reverse engineered the existing database to conduct source data analysis and identify the relationships between data sources
  • Replicated schemas for development purposes using PowerDesigner
  • Normalized the data and developed the relational database for the different web services
  • Created the database scripts using PowerDesigner and submitted them to the DBA

Environment: Oracle 10g, PowerDesigner, Oracle SQL Developer, OBIEE, SQL*Plus, Visio, MS Office, CVS, PuTTY, SoapUI.

Confidential, Richmond, VA

Data Modeler

Responsibilities:

  • Interacted with business users to analyze the business processes and requirements; transformed requirements into Conceptual, Logical and Physical Data Models; designed the database; and documented and rolled out the deliverables
  • Conducted analysis and profiling of potential data sources, upon high level determination of sources during initial scoping by the project team
  • Coordinated data profiling/data mapping with business subject matter experts, data stewards, data architects, ETL developers, and data modelers
  • Worked on the Bank Data Warehouse and Mortgage Data Warehouse
  • Developed logical/ physical data models using Erwin tool across the subject areas based on the specifications and established referential integrity of the system
  • Normalized the database tables into Third Normal Form (3NF) for the data warehouse
  • Involved in dimensional modeling, identifying the Facts and Dimensions
  • Maintained and enhanced the data model with changes and furnished it with definitions, notes, reference values and checklists
  • Worked very closely with Data Architects and the DBA team to implement data model changes in the database in all environments; generated DDL scripts for database modifications, Teradata macros, views and SET tables
  • An enthusiastic and project-oriented team player with solid communication and leadership skills and the ability to develop different solutions for challenging client needs

Environment: Teradata, SQL Server 2000/2005, Informatica PowerCenter, Erwin 7.5.2, Netezza, Windows XP, XML, Excel, Access, Visio.

Confidential, Charlotte, NC

Data Modeler/Data Analyst

Responsibilities:

  • Built Data Marts (Conceptual, Logical and Physical) for the reporting environment
  • Created Data Dictionaries for user-defined databases (Collateral, Securities and Reporting) in Oracle and SQL Server
  • Worked on Data Conversion for ACH Transactions
  • In the role of Data Analyst performed analysis and design of extensions to an existing data warehouse/mart business intelligence platform
  • Defined enterprise data architecture vision, strategy, principles and standards; got buy-in from stakeholders, management and business partners; and propagated them throughout the company
  • Segregated data and organized the data for common subjects using ODS
  • Performed Data Architecture in Designing and implementing a Metadata Repository providing a centralized information source for the data models, data maps, processes, documents, contact lists, project calendars and issues affecting the merger with Wachovia
  • Implemented Agile Methodology for building an internal application
  • Worked on Data Aggregation
  • Generated Discrepancy reports using Ascential Data Stage
  • Prepared Functional Specifications Document for the project
  • Technical hands-on expertise using the OLAP tool Business Objects 4/5.x/6.0/XI R2 (Reports, Designer, Web Intelligence, InfoView, and Supervisor)
  • Designed Universes and developed and generated complex, ad-hoc and dashboard reports
  • Analyzed data in both application and reporting databases and resolved the discrepancies
  • Tracked Data Model Change Requests to closure
  • Reverse engineered the old database and created subject areas for each schema
  • Prepared High Level Data Flow for all the major Applications
  • Well experienced in writing Complex queries, stored procedures, functions, cursors and packages using PL/SQL Programming
  • Developed robust and efficient oracle PL/SQL procedures, packages and functions that were useful for day to day data analysis
  • Fine tuned existing PL/SQL code and worked on upgrade from 9i to 10g
  • Fine tuned Oracle SQL to be more efficient
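The effect of index-driven tuning like the SQL work above can be made visible with an execution plan: before an index exists the optimizer full-scans the table, afterwards it searches the index. The sketch below uses SQLite's EXPLAIN QUERY PLAN as a stand-in for Oracle's EXPLAIN PLAN, with hypothetical names:

```python
import sqlite3

# Hypothetical transaction table with no index on the filter column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (acct_id INTEGER, amount REAL)")

# Plan before indexing: a full-table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM txn WHERE acct_id = 42").fetchone()[3]

# Add an index on the filter column and re-check the plan.
conn.execute("CREATE INDEX ix_txn_acct ON txn(acct_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM txn WHERE acct_id = 42").fetchone()[3]

print(plan_before)  # a SCAN of txn
print(plan_after)   # a SEARCH of txn using ix_txn_acct
```

The same before/after comparison is the core loop of SQL tuning in any engine: read the plan, change the physical design (index, partition, rewrite), and read the plan again.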

Environment: Oracle 9i/10g, SQL Server 2000/2005, Erwin 7.5.2, Power Designer 12.1, BO 6.5/XI R2, Netezza, QlikView 8.5, Ascential DataStage 7.5, ClearQuest, Windows XP, XML, Excel, Access.

Confidential

Systems Analyst

Responsibilities:

  • Responsible for accuracy of the data collected and stored in the corporate support system
  • Worked on analyzing source systems and their connectivity
  • Performed data reviews; evaluated, designed, implemented and maintained the company database
  • Involved in construction of data flow diagrams and documentation of the processes
  • Interacted with end users for requirements study and analysis by JAD (Joint Application Development)
  • Performed gap analysis between the present data warehouse to the future data warehouse being developed and identified data gaps and data quality issues and suggested potential solutions
  • Participated in system and use case modeling like activity and use case diagrams
  • Analyzed user requirements and worked with the data modelers to identify entities and relationship for data modeling
  • Actively participated in the design of data model like conceptual, logical models using Erwin

Environment: Erwin 4.x, Oracle 9.x, Toad, and MS Excel and Access
