- 20 years' experience as an IT professional; considered a senior Level V Data and Information Architect with strong Conceptual and Logical Data Modeling, Physical Database Design, and Database Administration skills. Ms. Sterling has extensive back-end development expertise for all phases of the SDLC (Software Development Life Cycle), with significant experience in repeatable end-to-end design, building and tuning. This includes implementing VLDBs (Very Large Databases) with 24/7 availability and high transaction volumes for OLTP (online transaction processing) and data warehouses. She possesses experience in translating business requirements into conceptual, logical and physical data models across multi-application environments. Over the past four years her project experience includes defining source-system-to-target-system analysis and mappings to create the staging and integration layers for data marts, operational data stores, and dimensional data warehouses for business intelligence users. Data cleansing activities include identifying duplicate and redundant data as part of the conversion process.
- Ms. Sterling is an Information Architect who applies both top-down and bottom-up modeling approaches and guidelines for system and enterprise-wide application solutions. This includes identifying opportunities for sharing and reuse of information across the enterprise regardless of the original source of the data (internally developed systems as well as third-party software).
- As a Project Manager and/or individual contributor, she has solid experience in JAD and Agile facilitation, Structured Methodologies, Project Scheduling/Implementation, Object-Oriented Design, Software Configuration Management and CRM roll-outs. She possesses a thorough working knowledge of client/server, mainframe and n-tier architectures. Has experience in multi-platform environments that include IBM mainframe and distributed systems, Sybase, UDB, Oracle, SQL Server and Informix, as well as back-end development experience (ETL processing, RI triggers, stored procedures, data conversion, data movement, and information management). Has experience in performing enterprise-wide strategic systems planning, business information planning, and business analysis. Sherry has strong interpersonal, written and verbal communication skills and demonstrated expertise as a top-level contributor and specialist.
IBM S/370, S/390, RS/6000 SP, AIX 4.3, 30xx, 90xx, 43xx, 37xx, 3174, PC-XT, PS/2, Sun Solaris, DB2 Universal Database Extended Enterprise Edition (UDB EEE) versions 5/6.2/7.1/8, PVCS Version Manager, VI Editor (command line), DOS, MVS, VM/CMS, TSO/ISPF/PDF, IMS/DB, CICS, DB2 MVS, Sybase 11.5/12.5, MS SQL Server, IDMS, RAMIS, FOCUS, JCL, LIBRARIAN, SAS, PANVALET, GML, HTML, SCRIPT, PSL, Excelerator, IEF, EFW, ADW, COBOL II, PL/1, IBM Assembler, FORTRAN, BASIC, SQL, Informix, Visual Basic, Lotus Notes, Windows/NT, Microsoft Office, MS Project, Adobe, LBMS/SELECT System Engineer, Object Manager, Process Engineer, Erwin 3.5.2/4.1/7.1/7.2/7.3, BPwin 7.x, Bachman, System Architect, PC PRISM, ToolBus, Cayenne, Cool DBA, PowerDesigner 12.0, DBArtisan, E/R Studio 6.5/7.0/7.5/8.0 (E/R Studio Repository), Oracle 8i, Oracle 9i, Oracle 10g, TOAD for Oracle 9.7, Altova XML 2008, oXygen, Composite Software, IBM IAA Data Model, DoDAF
- Business Intelligence Data Warehouse Architect, HBO, New York, NY 04/2009 - present. Data Architect for BI programs focusing primarily on reporting, analytics and scorecards. HBO plans to offer new web-based services via the Internet on a national basis this year. HBO GO℠ is a new service delivered through its television affiliates to consumers already subscribing to HBO. It provides instant, unlimited access to HBO online, anytime, anywhere, and includes streaming videos of HBO Original Series, blockbuster movies, sports, comedy and documentaries. Responsibilities included: Writing the functional and non-functional analytical reporting requirements for the Web HBO GO℠ (HoBB 2.0) project. Designing, modeling and creating a Data Warehouse specifically for the Marketing, Product and Web Analytical teams. This warehouse provides a clear set of fact and dimension tables to assist with metrics used to analyze the performance of HBO GO℠. Complexities included monitoring churn activity and calculating long-tail frequency distribution. The warehouse design was based on anticipating extremely large volumes quickly. Capacity planning, optimization and physical data store organization skills were used, as this project had to be ready for Phase 2, when similar services will be deployed for the Cinemax GO℠ offering. Documenting the movement of data between systems, the rules and transformation logic using the Data Lineage capabilities in E/R Studio. These source-to-target ETL specification mappings were created for incorporation into the Informatica 8.5.1 PowerCenter toolset. Creating and maintaining separate data model and DDL versions for the development, alpha, beta and production environments, table loads, data dictionary creation and capacity planning for a VLDB (Oracle 11g). Information Architecture/Data Architect, L-3 Communications/Army CIO/G6, Ft.
Monmouth, NJ 03/2007 - 1/2009. An accomplished Data Modeler/Architect hired to provide support to Army customers with functional and data modeling activities, data engineering analysis and architecture product development needs. All phases of data architecture, including data organization, data storage, data access, and data movement, were involved. Responsibilities included: Dissected complex problems and formulated timely, practical solutions, which explored, evaluated, and adapted new technologies to current solutions. Developed logical and physical data models to support interoperability in a net-centric environment. Developed guidelines and processes that supported the development of a suite of data products including, but not limited to: extensions to existing data models, vocabularies, taxonomies, ontologies, UML diagrams, XML schemas, and metadata tagging. Trained communities of interest (COIs) in the development and implementation of data products. Supported data validation planning and execution. Information Architecture/Data Warehouse, Subaru of America, Cherry Hill, NJ 01/2007 - 2/2007. An Information Architect responsible for the logical and physical design of a consolidated Data Warehousing project, which combined a number of production-system isolated data mart silos on different platforms onto one Oracle 10gR2 database. Accomplishments included: assessment of existing data assets. This involved reverse engineering each database, documenting data items (including definitions), assigning data types according to Oracle conversion rules, assigning logical data item names, as well as capturing declarative defaults, check constraints and primary and foreign key references. Embarcadero's E/R Studio 7.2 and CA's Erwin 7.1 products were used. Part two involved importing data into a Metadata Repository. The third step included executing queries to produce a result set based on the physical object's logical name and data type characteristics.
The fourth step involves reverse engineering the fully annotated Metadata Repository back into an upper CASE tool. Additional steps involve uniquely naming each database object (tables, columns, views, triggers, storage, defaults, check constraints, indexes, domains, relationships) from the Metadata Repository. Final steps involve model review and DDL generation. Please note that all of this was accomplished quickly. This is achievable regardless of database platform or size as long as the methodology outlined is followed. Data Architecture/Data Warehousing, Dun & Bradstreet, Parsippany, NJ 07/2006 - 9/2006
- Information Architect for the world's leading source of business information on a new transactional Data Warehouse for D&B's global commercial database which, upon initial load, will contain more than 100 billion business records and detailed trade information from purchasers and suppliers. Responsible for both the logical and physical design of the database for Phase 1 and Phase 2 implementation. This project was developed in an Oracle 10g RAC environment with very large partitioned tables. Erwin 7.1 was used as the modeling tool.
- Information Architecture/Data Warehousing, ESPN, Bristol, Connecticut 03/2005 - 3/2006
- Responsible for the back-end redesign of an undocumented Oracle v6 legacy database consisting of 4,000 independent tables, resulting in a data element inventory in excess of 20,000 items. The new system will be delivered using Oracle 10gR2 in a .NET environment. There were numerous challenges. The approach:
1. Discovery/Capturing the Current State:
- Assessment of existing data assets included reverse engineering each database, schema by schema. However, less than 20% of user tables had primary keys, so foreign key relationships were nonexistent. Instead, unique indexes were used to create relationships between tables. This task was further compounded by the use of nulls within indexes (unique and non-unique); 80% of all data items had been defined as nullable. Each base data item was prefixed with the table's name. This resulted in re-defined names using the tool's role-name capability. Table and column usage scripts were executed to determine what actually was used and what could be eliminated.
- Two sets of pictorial results were presented to each functional manager for verification:
- The as-is state depicting implied logical relationships.
- A base-lined state depicting implied logical relationships, the removal of tables and columns no longer in use, correctly specified null status for columns, and a table indexing strategy.
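The table and column usage assessment described above can be sketched as follows. This is a minimal illustration under assumed structures: the catalog format, table names and the 90-day usage metric are hypothetical, not the actual scripts used on the engagement.

```python
# Minimal sketch of a table/column usage assessment: flag unused tables,
# keyless tables, and the overall nullable-column ratio.
# The catalog structure and sample data are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Table:
    name: str
    columns: dict          # column name -> nullable (True/False)
    has_primary_key: bool
    rows_read_90d: int     # usage metric, e.g. from audit/monitoring data

def assess(tables):
    """Return findings: unused tables, keyless tables, nullable percentage."""
    findings = {"unused": [], "no_pk": [], "nullable_pct": 0.0}
    total_cols = nullable_cols = 0
    for t in tables:
        if t.rows_read_90d == 0:
            findings["unused"].append(t.name)
        if not t.has_primary_key:
            findings["no_pk"].append(t.name)
        total_cols += len(t.columns)
        nullable_cols += sum(1 for n in t.columns.values() if n)
    findings["nullable_pct"] = round(100 * nullable_cols / total_cols, 1)
    return findings

catalog = [
    Table("GAME_STATS", {"GAME_ID": False, "SCORE": True}, True, 120_000),
    Table("OLD_FEED", {"FEED_ID": True, "PAYLOAD": True}, False, 0),
]
print(assess(catalog))
```

A report like this is what fed the "as-is" versus base-lined comparisons presented to the functional managers.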
Findings and recommendations were presented to senior management along with implementation plans. Additional accomplishments:
1. Wrote detailed design standards covering:
The need for standardization
The correct and incorrect usage of every database object
Table and column naming conventions, industry de facto class words and abbreviations
Principles of Data Warehousing: Star Schemas, Snowflakes, Facts and Dimensions, slowly changing dimension (SCD) types 1, 2 and 3.
Denormalization and the use of roll-up tables
Standards were published, rolled out and enforced across a 200-person application development team.
2. Purchased an initial ten seats each of the Embarcadero Technologies suite of tools (ER/Studio Enterprise with Repository, DBArtisan, Describe) to enable the Data Architect, DBA and application development teams to begin to effectively document and communicate with one another. Later, I negotiated concurrent license agreements with Embarcadero Technologies, which resulted in a 70% savings to ESPN.
Conducted formal training classes, in addition to one-on-one training on tool usage, for more than 60 developers. Once complete, the use of Microsoft Visio and the de-supported Oracle Designer as tools was shut down.
- Administered the E/R Studio Repository to ensure complete database lifecycle support, enterprise and metadata model management.
- Managed a team of data architects.
- Enforced and policed standards and the reusability of existing objects.
- Facilitated database design review.
- Introduced Software Configuration Management standards to limit development access to secured environments (QA, UAT, PROD).
- Performance-tuned PL/SQL code.
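Of the slowly changing dimension types covered in the design standards above, type 2 (row versioning) is the most involved. A minimal sketch of the technique follows; the dimension layout, column names and sample customer key are illustrative assumptions, not the actual ESPN schema.

```python
# Minimal sketch of a Type 2 slowly changing dimension update:
# expire the current row and insert a new version rather than
# overwriting in place. Column names are illustrative.

import datetime

def scd2_update(dim_rows, natural_key, new_attrs, today=None):
    """Expire the current row for natural_key and append a new version."""
    today = today or datetime.date.today()
    for row in dim_rows:
        if row["nk"] == natural_key and row["end_date"] is None:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dim_rows          # no change -> no new version
            row["end_date"] = today      # close out the current version
    dim_rows.append({"nk": natural_key, **new_attrs,
                     "start_date": today, "end_date": None})
    return dim_rows

dim = [{"nk": "C100", "region": "East",
        "start_date": datetime.date(2004, 1, 1), "end_date": None}]
scd2_update(dim, "C100", {"region": "West"}, today=datetime.date(2005, 6, 1))
# dim now holds two rows: the expired East version and the current West one
```

Type 1 would simply overwrite `region` in place; type 3 would keep the prior value in an extra column.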
- Information Architecture/Data Administration, Confidential
- Member of the Architecture Resource Management Division (ARMD), the bank's central point for the analysis and implementation of information, data and database architectural designs and integrated system solutions. A number of platforms were supported: mainframe DB2, Sybase, Oracle, UDB, and SQL Server. Responsible for the logical/physical design and back-end maintenance of 48 applications sourced in multiple locations (NY, West Paterson, Teaneck, Delaware, Syracuse), among them:
- Capital Funding; Mutual Funds Accounting and Administration; Consumer Loans Data Mart
- Private Client Web Application; Project Quality Assurance; Corporate Trust Investor Reporting
- Customer Information File; Contingent Convertible Debt; Bond Call Lottery System
- Fund Pricing System; Corporate Trust Environmental; Daily Overdraft Monitoring
- Electronic Document Management
- Additional responsibilities included creating metadata databases, data mapping, data conversions and creating index strategies for performance.
Data Modeling/Data Mapping/Data Admin
- Responsible for migrating several disparate financial services repositories (VSAM, DB2, UDB) and multiple external vendor feeds (Bloomberg, Moody's, FTID) into a single OTS product (SQL Server back-end) to maintain static securities reference data. This was part of a Morgan Stanley initiative to align and centralize major IT components into a central source in an effort to eliminate the cycles of manual intervention and reconciliation. Activities included:
- Analyzed Fixed Income legacy system interfaces to document data flows, data stores and copy-books. Analyzed COBOL programs to perform source-to-target data mapping: existing to new database, and new database back to existing systems (VSAM, DB2, UDB).
- Created a metadata database as a working repository to store mapping results and business rules, and to execute reports used for gap analysis and to provide the detailed add-ons needed for the vendor's generic model. Wrote SQL queries, which were used to provide the data transformation logic for the Informatica ETL layer.
- Prepared presentation materials for senior management
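The gap-analysis reporting described above can be illustrated with a small sketch. The field lists and mapping pairs below are simplified assumptions, not the actual repository schema or security-master fields.

```python
# Sketch of a source-to-target gap analysis over a mapping repository:
# report source fields with no target mapping and target fields never fed.
# Field names and mappings are illustrative.

def gap_analysis(source_fields, target_fields, mappings):
    mapped_src = {s for s, _ in mappings}
    mapped_tgt = {t for _, t in mappings}
    return {
        "unmapped_source": sorted(set(source_fields) - mapped_src),
        "unfed_target": sorted(set(target_fields) - mapped_tgt),
    }

source = ["CUSIP", "ISSUER_NM", "COUPON_RT", "LEGACY_FLAG"]
target = ["security_id", "issuer_name", "coupon_rate", "rating"]
maps = [("CUSIP", "security_id"), ("ISSUER_NM", "issuer_name"),
        ("COUPON_RT", "coupon_rate")]
print(gap_analysis(source, target, maps))
# {'unmapped_source': ['LEGACY_FLAG'], 'unfed_target': ['rating']}
```

In the actual work, queries like this against the mapping repository surfaced both legacy fields with no home in the vendor's generic model and vendor fields needing new feeds.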
Data Analyst, Cendant
- Responsible for creating the baseline back-end architecture used to replace 8 legacy systems. This SQL Server 2000 database was used to host a new web-based front-end Single Point of Entry (SPE) application for Cendant's Hospitality Hotel sector. Data consists of static property data, rates, rate plans and room inventories. Duties included:
- Analyzed legacy system interfaces to document data flows and data stores as an aid to creating non-existing database relationships. Reverse engineered each Informix 7.1 production database and created primary and foreign key relationships to depict a pictorial view of the as-is model.
- Forward engineered each physical model to its logical equivalent, assigning logical names to all tables and data items.
- Created a consolidated logical data model, combining all models into one. Redundancies and obsolete database objects were eliminated to create the final Reconciled Data Model.
- Designed, created and loaded details from all models into a metadata database, which captured the legacy system data dictionary details and cross-referenced them to the newly created reconciled structures.
- Coordinated offshore activities with application teams in India.
- Responsible for the redesign and data conversion of several Informix and Sybase databases to Oracle (see below) to improve performance and align with the departmental objective to maintain all of the distributed databases on a single, more cost-effective platform solution.
- Supported eCommerce (B2B, B2C) web-based applications as follows:
- Reverse-engineered production databases using CA's Erwin 3.5.2 and 4.0 and Embarcadero's E/R Studio. Created pictorial views of the database (a Physical Data Model). Queried system catalog and user tables. Met with application teams to review results and obtained sign-off of the table design based on new requirements and existing usage.
- Conversion activities included massive clean-up, redesign of tables and the elimination of database objects (tables, columns and stored procedures) for which there was no longer any usage. Often new tables were added to replace and/or enhance obsolete functionality.
- Data integrity checks were performed to assure that there were no data-type mismatches and that converted data landed correctly. Ensured Phase 1 and Phase 2 HIPAA (Health Insurance Portability and Accountability Act) compliance.
The following databases were successfully converted:
- Web Activity Reporting (WAR): Informix to Oracle 8i
- Quality Management Reporting (QMR): Informix to Oracle 8i
- Performance Management Reporting (PMR): Informix to Oracle 8i
- AutoFax: Sybase to Oracle 9i
- ILF/Self-Documenting, Self-Describing Error Codes (ExPert): Sybase to Oracle 9i
- Project Tracking System: Sybase to Oracle 9i
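The post-conversion integrity checks mentioned above can be sketched as a simple reconciliation of row counts and content checksums. The table data and the choice of MD5 as the checksum are illustrative assumptions; in practice the rows would come from Informix/Sybase and Oracle cursors.

```python
# Sketch of post-conversion data integrity validation: compare row counts
# and an order-independent per-table checksum between source and target.
# Sample rows and the hashing scheme are illustrative.

import hashlib

def table_checksum(rows):
    """Order-independent checksum over stringified rows."""
    digests = sorted(hashlib.md5(repr(r).encode()).hexdigest() for r in rows)
    return hashlib.md5("".join(digests).encode()).hexdigest()

def reconcile(source_rows, target_rows):
    return {
        "rowcount_match": len(source_rows) == len(target_rows),
        "checksum_match": table_checksum(source_rows) == table_checksum(target_rows),
    }

src = [("A1", 10), ("A2", 20)]
tgt = [("A2", 20), ("A1", 10)]        # same data, different load order
print(reconcile(src, tgt))
# {'rowcount_match': True, 'checksum_match': True}
```

Sorting the per-row digests makes the comparison insensitive to load order, which matters when the target was loaded in parallel.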
- Multiple Roles, Prudential Financial, Roseland, NJ 5/1997 - 9/2001
- Data Modeler, also performing the role of DBA and Data Architect
- DBA, also performing the role of a Data Modeler
Management - Project Lead/Mentor
- Was responsible for the design and build of numerous systems, from development to promotion for hand-over to the Corporate Production team, during my 4-year contractual tenure at Prudential. As a member of the centralized system services group, was heavily involved in driving the design, selection, construction, and implementation of database applications and infrastructure services. Participant in all application development projects and initiatives to engineer/design Unix and MVS systems that supported in-house developed and third-party applications. Reported directly to the Vice President of Data Assets with dotted-line matrix reporting to other project executives.
- Lead Data Modeler: Created, maintained, and enforced standard written procedures for the physical build of Sybase, MVS/DB2, and UDB/DB2 database objects (schemas, tables, table spaces, RI triggers, stored procedures, defaults, views, indexes, and foreign key, primary, unique and check constraints), including naming conventions. Reuse of existing model components ensured all projects were in alignment with the enterprise model; thus federated systems could be produced using a strict feed-it/read-it methodology. Provided database and data management reviews from a technical and functional perspective across a number of platforms. Formally trained junior modelers and new staff. Generated appropriate DDL for the creation of more than 20 different systems. Wrote all non-declarative RI triggers. Model types included: Conceptual Models (from written requirements); Forward- and Reverse-Engineered Models; Stage-In/Stage-Out Promotion Models for ETL; Data Warehouse Models; Physical Models; UDB RI Exception Models; Federated Warehouses. Development DBA: DBA on one DB2/UDB and six Sybase development projects. Created and maintained database objects in Unix development and UAT environments, including database capacity planning and sizing (physical detailed volumetrics). Responsible for the development and execution of conversion scripts, initial load of static data, exports, backups, RUNSTATS and REORGs. Maintained application database-level security; facilitated query and application performance tuning; familiar with messaging layers and n-tier architectures.
Created standard Project Initiation, Statement of Work (SOW) documents, Service Level Agreements, and Project Scheduling templates for the centralized database group. All database activities above were part of the metrics required to achieve and then maintain Level 2 CMM compliance, specifically: reusability and repeatability.
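The standardized DDL generation described above can be illustrated with a small sketch. The model format, the sample table and the upper-case naming rule are assumptions for illustration, not the actual Prudential procedures.

```python
# Sketch of generating CREATE TABLE DDL from a simple model definition,
# applying a naming convention (upper-case names) uniformly.
# The model format and sample table are illustrative assumptions.

def generate_ddl(table, columns, pk):
    """columns: list of (name, type, nullable); pk: list of column names."""
    lines = [f"CREATE TABLE {table.upper()} ("]
    for name, dtype, nullable in columns:
        null_sql = "NULL" if nullable else "NOT NULL"
        lines.append(f"    {name.upper()} {dtype} {null_sql},")
    lines.append(f"    PRIMARY KEY ({', '.join(c.upper() for c in pk)})")
    lines.append(");")
    return "\n".join(lines)

ddl = generate_ddl(
    "policy_holder",
    [("policy_id", "INTEGER", False), ("holder_name", "VARCHAR(60)", True)],
    ["policy_id"],
)
print(ddl)
```

Driving DDL from one model definition per environment is what keeps development, UAT and production schemas from drifting apart.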
Selected Systems and Applications
Data Modeler and development/UAT/QA DBA for an enterprise-wide, 24/7-available Customer Relations Management (CRM) initiative. This trailblazing, complex vendor architecture used Clarify's first UDB/DB2 packaged solution running on IBM AIX with RS/6000 SP Enterprise servers, resulting in a 1,200-logical-volume file system. Components included:
High Availability (HACMP/ES) with RS/6000 Cluster Technology
- EMC Symmetrix Remote Data Facility (SRDF)
- A Symmetrix Business Continuance Volume (BCV) functioning as mirrored media for a protected storage environment. Used as an interim solution for Cognos reporting.
- MQSeries messages and BEA's Tuxedo software, housed on separate application servers, were used for communication. Tuxedo managed workload balancing.
- Created the conceptual design consolidating several Business Units' requirements into a single physical model addressing both design and very-high-volume performance issues: 150K new Cases and 105K Service Requests processed daily:
- Debugged back-end challenges associated with Clarify's first rollout release on UDB.
- Wrote detailed Clarify Database Installation Instructions in which custom UNIX command-line scripts and Clarify utilities were executed:
1. Created the Clarify EEE UDB database on a single SP node,
2. Applied the 25 database customizations, creating 150 new tables for a total of 604 tables per instance,
3. Mapped tables to custom SMS and DMS table spaces for the UAT/DRP and Production environments,
4. Updated database configuration parameters for performance and created performance indices,
5. Built and loaded complex and simple static tables (including 10,000 call center employees),
6. Imported Clarify user-defined forms, pop-up lists and corresponding source code,
7. Ran conversion scripts.
- Created and supported 15 databases in the development, UAT, Quality Assurance, Training and Production environments.
- Worked on the Clarify UDB/DB2 optimizer development, UAT and production teams to design, build, test, and implement new performance capabilities on which multiple databases in the UAT/DRP and Production environments were built. Worked extensively with the infrastructure's back-end server components.
- Duties included the development and routine testing of disaster recovery component plans for this application. Simple validations consisted of notification tests to assure all branches of the call tree were still attached. Full-scale exercises covered several days and numerous iterations to test DRP failover/failback. This required working many rotating day/night, weekend and holiday shifts.
- Provided back-end support for 30 application developers. Supported the rollout release for each Business Unit from development to production hand-over.
Data Modeler and Development DBA for several applications including Time Tracking, Customer Warehouse, Policy Administration systems and Salesforce Management systems. Additional skills:
- Executive-Level Reviews and Presentations; Configuration Management; Structured Walk-throughs
- Quality Assurance; Change/Release Management; Project Management
- PVCS Version Control; Disaster Recovery Implementation; Point-in-Time Backup Recovery
- Data Quality/Conversion Requirements; Unix Shell Scripts; Tivoli Alerts
- Run Book Creation; Knowledge and Skills Transfer
- Project Manager, Prudential Health Care, Roseland, NJ
- Implementation Specialist and mentor responsible for the corporate rollout of Project and Process methodologies to the 750 Prudential associates involved in all phases of software application development. The SEI's (Software Engineering Institute) CMM (Capability Maturity Model) was the initiative by which progress would be measured.
- The goal was to improve the maturity of the software processes from ad hoc and often chaotic processes to mature, disciplined software processes. The tools of choice were LBMS's Process Engineer (later Platinum, now CA) and Microsoft Project. Responsibilities: writing training and implementation plans and procedures, workstation installations, conducting training, and software customization. Coach for breakthrough projects.
Data Analyst, FISA/City
Data Analyst responsible for MVS/DB2 database design and program specifications for a batch Worker's Compensation System that interfaced with the City of New York's Payroll/Benefit Management System and the Mayor's Law Department. The system was mandated by the Governor of New York to track benefits and expenditures of the 200,000 uniformed personnel (police officers, firemen, EMS workers) employed by the City of New York. Emphasis was focused on the complexity of union contracts held by employees who spent extensive time away from the job due to injuries sustained in the line of duty.
- Was responsible for the selection of the Corporation's CASE tool. Chose LBMS as the then best-of-class product. It supported multiple RDBMSs, had bi-directional communication with Visual Basic, and easy desktop integration with Microsoft products and other major software vendors.
- Built a Corporate Repository of metadata and warehoused data of legacy systems using LBMS, reverse engineering data contained in SQL Server, DB2, Sybase, Lotus Notes and Access databases. Source code, stored procedures, JCL, Word documents, Visual Basic and CICS screens, Business Rules, Excel spreadsheets and Lotus Notes were linked to data and processes embedded in the repository. Documented Business Rules. Impact Analysis reports traced each instance in which a data item was found. Presented the capabilities and usefulness to upper management.
- As System Administrator, was responsible for product rollout and its certification by Corporate LAN Administration. Wrote installation instructions for LAN personnel. Created standards for the tool's usage: Configuration Management procedures included access control, object versioning, backups/restores, and data import/export. Wrote material to train Data Analysts in logical model management; DBAs in physical design (table spaces, segments, DDL generation, database sizing, stored procedures and RI triggers); and Business Analysts in event/process modeling and creating pseudocode-level specifications for hand-off to developers. Visual Basic developers were trained to promote designs to the Repository. Demonstrated that the 75-workstation rollout provided a comprehensive environment in which application development could finally be managed.
- One of four managers contracted to develop an in-house Telephone Acceptance Plan, a joint venture between SNET and MasterCard International to provide its 100 million business/consumer cardholders with the ability to place long-distance calls using their MasterCard, on mainframe DB2. Responsible for the largest sub-system: Customer Account Management (CAM); e.g., the Cardholder Account table contained 100M rows, the Personal Identification Number (PIN) table 350M rows, and the Maintenance Log 750M rows. Accomplishments:
- Managed a development staff consisting of 15 senior level programmers
- Responsible for building the Logical and Physical Data Models using IEF as the CASE tool. Generated DDL, performed benchmarks, and supervised base table loads.
- Built system-to-system interfaces
- Wrote the System Integration Test Plan and supervised the user, system, and integration testing with a team of 12 staff members.
- Held a number of positions while assigned to work on a national system developed on a CICS MVS/DB2 platform. The system provided fiscal compensation to AT&T users that included the Hotel/Motel and Military industries, Penal and Educational Institutions, users of public telephones and private pay telephone companies. COPA (Commission Payment Administration System) consisted of 125 batch programs and 105 online functions. Accomplishments:
- Was responsible for the redesign of the COPA Release 2.0 logical and physical data models utilizing Excelerator. Created business definitions for data entities and attributes. Maintained logical-to-physical cross-referenced models. Created a Process Model containing screen and report designs, data flow diagrams, volumes, and edit rules.
- Responsible for system testing four releases of COPA. Worked evening and night tours; trained and managed a staff of five. Wrote complex queries using SPUFI, QMF and PRF to verify data, balance monthly commission batch jobs and validate results.
Managed a development team of seven senior programmers. Designed and implemented a prototype Voucher System for the Alternate Sales Force. Was responsible for developing data and process models, designing structured program specifications, commission schedules, and complex commission payment programs. Conducted JAD sessions to gather data requirements, led the design team, and facilitated code walk-throughs. The system was developed using Informix, 4GL, IEF, and ISQL. Was responsible for training the users of the system.
Data Modeler, Bell
- Responsible for the creation of a Corporate Enterprise Data Model for the new development of data-driven systems needed to support this company's explosive growth as a major cellular telephone service provider. Responsible for the Product, Vendor and Material data subject areas. Created logical and physical data models.
- Tactical Planner/Data Modeler, AT&T Consumer
- Developed and wrote tactical plans for the implementation of a Market Research System and an Advertising Optimization Model. These plans were based on requirements that supported the long-term strategic objectives for building information systems. For both systems, was responsible for creating the logical data model, project costing using function point analysis techniques, project planning and tracking, defining the technological architecture, and activity decomposition, as well as user interviews and system inventories.
Database Administrator, AT&T Communications
- Database administrator/designer responsible for creating IDMS databases for an interdepartmental architecture required to verify $20 billion associated with Access Charges and Revenue expenses. Accomplishments included:
- Central DBA on a task force convened to conclude the normalization process and assign common naming conventions for the logical and physical data models. The effort entailed extensive interface with ten project DBAs and supervision of a data entry staff.
- Constructed fully annotated operational and linkage table diagrams. Expanded structures during the creation of IDMS Network Structures. Generated usage views, access paths, and quantitative details for navigation through the data model. Volumes, retrieval sequence, batch requirements, and activity frequencies were included.
- A key participant in the Requirements, Build, Test and Installation phases for major enhancements to an IMS table-driven National Order Processing System. Achievements included:
- Responsible for administering the physical User Acceptance Test environment and the overall technical assessment of the UAT system. Duties performed: installing and initializing the test database, authenticating certified code, executing job-streams and verifying transactions. Conducted formal UAT tests.
- Produced high-level conceptual database and systems designs, data flow diagrams, programmer specifications, data dictionary entries, screen designs (MFS) and report layouts. In addition, wrote and packaged the logical user requirements, which established the baseline for systems development for five major releases of this system.