Sr. Data Modeler Resume
Jersey City, NJ
SUMMARY
- Around 8 years of experience in Data Analysis, Data Validation, Data Profiling, Data Verification, Data Loading and Data Warehouse testing.
- Specialized in Requirements Gathering coupled with Business Analysis, resulting in Conceptual, Logical and Physical database solutions for Relational (OLTP) and/or Dimensional (OLAP - Star Schema, Snowflake) systems.
- Expert in Analysis, BI Reporting, Design and Development of ETL (Extraction, Transformation and Loading) mechanisms using SQL Server 2005 SSIS, SSAS, SSRS and DTS in complex, high-volume Data Warehousing projects in both Windows and UNIX environments.
- Systems Analysis, Data Analysis, Segmentation, Data Profiling, Report Generation and Production Support for relational database application reporting.
- Excellent experience in Data Modeling using Erwin r9/7.3/4.5/3.5.2 and ER Studio.
- Expertise in developing Slowly Changing Dimension mappings using Type 1, Type 2 and Type 3.
- Familiar with implementation of SSIS (SQL Server Integration Services), Database Mirroring and Service Broker, some of the new features in SQL Server 2005.
- Advanced experience in technical architecture of ETL tools, data staging techniques, source data integration and source-to-target mapping
- Expertise in different types of loading, such as Normal and Bulk loading; involved in Initial, Incremental, Daily and Monthly loads.
- Experience in Teradata V2R4/V2R5/V2R5.1/TD12/TD13, creating indexes, tables, views, macros and stored procedures, and collecting statistics.
- Experience in using Teradata load utilities such as FastExport, FastLoad and MultiLoad.
- Experienced in designing reports using SQL Server Reporting Services (SSRS) and Excel Pivot Tables based on OLAP cubes.
- Experience in developing data updates, data cleansing and reporting.
- Knowledge of designing data warehouse appliances in Netezza, and of DTS and NDM data transfers.
- Experience in integration of various data sources with Multiple Relational Databases like Oracle, SQL Server, MS Access and Flat Files.
- Extensive experience in Data Analysis, Data Cleansing, Requirements gathering, Business Analysis, Data Mapping, Entity Relationship diagrams (ERD), Architectural design docs, Functional and Technical design docs, and Process Flow diagrams
- Expert in testing and writing SQL and PL/SQL scripts.
- Expert in creating Unix Shell and Perl Scripts.
- Expertise in creating the Data flow diagrams using MS VISIO.
- Strong experience in creating macros in Excel and MS Access.
- Highly experienced with different RDBMSs such as Oracle 9i/10g/11g, MS SQL Server 2000, Teradata and MySQL.
- Expertise in Erwin, TOAD, SQL Loader and WinSQL tools.
- Strong working experience in the Data Analysis, Design, Development, Implementation and Testing of Data Warehousing using Data Conversions, Data Extraction, Data Transformation and Data Loading (ETL)
- Strong interpersonal skills, ability to interact with individuals at all levels, excellent communication and presentation skills.
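The Slowly Changing Dimension work mentioned above can be illustrated with a minimal Type 2 sketch. The table, columns and dates below are hypothetical, chosen only to show the expire-and-insert pattern; real projects ran this logic in Informatica/SSIS against Teradata or SQL Server.

```python
import sqlite3

# Minimal SCD Type 2 sketch: track attribute changes with versioned rows.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        sk          INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        customer_id TEXT,                               -- natural key
        city        TEXT,                               -- tracked attribute
        eff_date    TEXT,
        end_date    TEXT,
        is_current  INTEGER
    )""")

def scd2_upsert(cur, customer_id, city, load_date):
    """Expire the current row if the tracked attribute changed, then insert a new version."""
    row = cur.execute(
        "SELECT sk, city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if row and row[1] == city:
        return  # no change: keep the current version
    if row:
        cur.execute("UPDATE dim_customer SET end_date=?, is_current=0 WHERE sk=?",
                    (load_date, row[0]))
    cur.execute(
        "INSERT INTO dim_customer (customer_id, city, eff_date, end_date, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, load_date))

cur = conn.cursor()
scd2_upsert(cur, "C001", "Jersey City", "2020-01-01")  # initial load
scd2_upsert(cur, "C001", "Hoboken", "2020-06-01")      # change -> new version, old row expired
history = cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY sk").fetchall()
```

Type 1 would overwrite the attribute in place, and Type 3 would keep the prior value in an extra column; Type 2, as here, preserves full history.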
TECHNICAL SKILLS
Operating Systems: Windows NT/98/2000/XP, UNIX and MS-DOS
Languages: UNIX Shell, Perl, HTML, XML, XSLT, VBScript, Java 2.0, SQL, T-SQL, PL/SQL
Databases: Oracle 9i/10g/11g, SQL Server 2005/2008, DB2, Teradata and MS Access
Testing Tools: Quick Test Pro and Quality Center.
Utilities: TOAD, Citrix Server, Oracle SQL Developer, PuTTY, VPN, NetMeeting, CuteFTP and Teradata SQL Assistant
ETL: DataStage 8.0, Informatica 8.1/8.5/9.1, Ab Initio (GDE 1.14, Co>Op 2.14), SSIS
Reporting Tools: Business Objects XI R2, OBIEE, Crystal Reports, Cognos 8.0 Series, SSRS
PROFESSIONAL EXPERIENCE
Confidential, Jersey City, NJ
Sr. Data Modeler
Responsibilities:
- Analyzed the Business Requirements and Rules and worked with Business Analysts and Business Users in preparing mapping documents and involved in Test Plan preparation.
- Participated in JAD sessions with application teams, reviewed and gathered the requirements.
- Analyzed source data and created views on source data, source to target mapping documents to integrate data from different billing systems.
- Created Aggregate Fact tables, history fact tables and type II dimensions for reporting needs.
- Closely worked with Data architect, modeler to understand the architecture and participated in reviewing the source to target mapping documents.
- Developed normalized Logical and Physical database models to design OLTP system
- Created mapping documents for application specific data marts and created views for downstream reporting applications.
- Worked with different types of source systems such as PeopleSoft, SQL Server, Oracle and DB2, and file formats such as XML, Excel and flat files.
- Developed Data Dictionaries, Metadata, Design Standards, Design Guidelines, Design Best Practices, ERwin Tool Guidelines and SDLC Process Documents
- Strong understanding of logical and physical database design and dimensional modeling schemas such as Star and Snowflake.
- Worked with Teradata and SQL Server Database Administrators to implement staging, integration and data mart DDLs with respect to the warehouse modeling standards in the database.
- Extensively wrote SQL scripts to validate the data transfer from source to target.
- Wrote complex SQL queries to support the test case results and performed SQL performance tuning.
- Performed negative testing of data and created mock data to cover all testing scenarios.
- Worked to set up column level definitions and set other column level metadata in bulk editor in Erwin.
- Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model
- Tested the ETL Informatica mappings and other ETL Processes (DW Testing).
- Analyzed and tested Informatica Mappings with OLTP source data, staging and target data.
- Extensively used HP Quality Center/ALM as the test management tool.
- Tested MicroStrategy reports and dashboards according to the business requirements.
- Exclusively used MicroStrategy Web Analyst to test the MicroStrategy web reports.
- Tested several complex reports generated by MicroStrategy, including Dashboards, Summary Reports, Master-Detail and Drill-Down reports.
- Tested the Web application in multiple browsers.
- Participated in daily stand-up meetings, weekly status meetings in Agile methodology.
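The source-to-target validation described above follows a standard pattern: reconcile row counts, then use a LEFT JOIN to find keys that never arrived. A minimal sketch, using hypothetical `src_account`/`tgt_account` tables (the real checks ran against Teradata and SQL Server, but the SQL is the same shape):

```python
import sqlite3

# Hypothetical source/target tables populated with sample data for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_account (account_id INTEGER, balance REAL);
    CREATE TABLE tgt_account (account_id INTEGER, balance REAL);
    INSERT INTO src_account VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO tgt_account VALUES (1, 100.0), (2, 250.5);
""")

# 1. Row-count reconciliation between source and target.
src_count = conn.execute("SELECT COUNT(*) FROM src_account").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_account").fetchone()[0]

# 2. Keys present in source but missing from target (classic LEFT JOIN anti-join).
missing = conn.execute("""
    SELECT s.account_id
    FROM src_account s
    LEFT JOIN tgt_account t ON s.account_id = t.account_id
    WHERE t.account_id IS NULL
""").fetchall()
```

Here the count mismatch (3 vs. 2) flags a load gap, and the anti-join pinpoints the missing key so the defect can be traced to a specific mapping.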
Environment: SQL Server 2008, Oracle 11g, Netezza, SQL, PL/SQL, UNIX Shell Scripting, Erwin, Aginity Netezza Workbench, TOAD, MS Excel, HP Quality Center/ALM 11.0, CVS, MS Project, OBIEE, MicroStrategy, Informatica 9.1, Control-M.
Confidential, Dover, NH
Sr. Data Modeler
Responsibilities:
- Reviewed the business requirements and worked with business and requirements teams to understand them.
- Analyzed the business requirement document and created the mapping and analysis documents for data migration.
- Analyzed the source systems and source files and created the transformation rules to load the data from source files into the reporting environment.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN
- Created logical & physical data models and metadata to support the requirements
- Created the SQL and PL/SQL scripts needed for reporting.
- Worked on performance tuning of the mappings and scripts for optimization.
- Created and Debugged the Informatica mappings needed for loading data from source to staging.
- Extensively used Teradata FastLoad, MultiLoad, TPump and FastExport utilities.
- Worked with the reporting teams to create the business views needed for reporting.
- Worked on exporting the data required for business teams and other 3rd party vendors.
- Created ad-hoc reports as needed, aligned to the Cisco fiscal year.
- Worked with deployment teams to deploy code for data transfer between source and target applications.
- Produced physical data models and suggested proper indexing that promotes fast data retrieval based on BI usage.
- Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
- Worked with the business and deployment teams to identify the frequency of source files and to schedule the Informatica workflows.
- Worked extensively with the Dollar Universe tool to schedule the jobs.
- Worked with QA resources to help them understand the mapping design to test the implementation.
- Analyzed defects and source data to identify the cause of each defect and close it.
Environment: Informatica 9.1, Teradata TD13, Oracle 10g, SQL, PL/SQL, UNIX Shell Scripting, OBIEE, TOAD, MS Excel, Erwin, HP Quality Center, CVS, XML, MS Project, Dollar Universe.
Confidential, Houston, TX
Sr. Data Analyst
Responsibilities:
- Worked with Wachovia and Confidential business teams to gather the requirements to migrate customers from Wachovia to Confidential.
- Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications
- Reviewed the business requirements and worked with business and requirements teams to understand them.
- Analyzed the business requirement documents and functional specification documents and created the mapping document and migration analysis documents for data migration.
- Analyzed source and target systems and their databases and created transformation rules to migrate the data from the source to the target application.
- Extensively worked on different source systems such as SQL Server, Oracle, flat files and mainframe systems to understand the input data.
- Worked with source systems technical teams to prepare and send the input data with required data fields to the conversion team.
- Developed Logical and Physical data models that capture current-state/future-state data elements and data flows using Erwin / Star Schema.
- Extensively worked on analyzing different source input file formats such as VSAM, COBOL, text, CSV, Excel and XML.
- Analyzed the target systems to understand their requirements for receiving the target data and gathered the information needed to generate output files and Oracle data dumps.
- Modeled views/materialized views in Erwin and helped ETL developers in physical instantiation of databases.
- Worked with Source and target systems to create the System Interface Agreement to document the details of data transfer method, server details and data details.
- Conducted all data model reviews as appropriate during project lifecycle
- Created the SQL and PL/SQL scripts to help the developers understand the source extraction logic.
- Participated in discussions on migration strategy and the ETL process to load data from source systems into the data warehouse.
- Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
- Used most of the transformations, such as Source Qualifier, Aggregator, connected & unconnected Lookups, Filter and Sequence Generator.
- Worked with SQL Server Database Administrators to implement staging, integration and data mart DDLs with respect to the warehouse modeling standards in the database.
- Analyzed and explored Business Objects and Cognos for reporting purposes.
- Provided weekly status reports to the client outlining tasks status, issues, and resolutions for the completed workweek.
- Involved in mapping logical data models to physical data models, and coordinated business requirements for data.
- Debugged the Informatica mappings and mapplets to make sure that the transformation logic is implemented correctly.
- Worked with QA resources to help them understand the mapping document, and reviewed the Master Test Plan to test the implementation.
- Ensured effective test coverage of new features, strategic test planning for early identification of bugs, coordinated follow up with ETL dev teams for faster resolution.
- Participated in defect review, status meetings to help the business teams to understand the defects/issues.
- Worked with different business and implementation teams to schedule the game plan tasks for the live conversion weekend and supported the live conversion events.
- Created the migration summary reports and other ad-hoc reports after the conversion event.
Environment: SQL Server, Oracle 10g, SQL, PL/SQL, UNIX Shell Scripting, SQL Loader, TOAD, MS Excel, HP Quality Center, CVS, XML, MS Project, Mainframe, Erwin, COBOL II, VSAM, Copy Books, Informatica 8.6/9.1.
Confidential, Milpitas, CA
Data Analyst
Responsibilities:
- Extensively involved in Business Analysis and Requirements Gathering
- Involved in Developing Test Plans and Developed Test Cases/Test Strategy with input from the assigned Business Analysts.
- Designed a Conceptual Data Model, Logical Data Model, and Physical Data Model using ERwin.
- Normalized the logical model to Third Normal Form (3NF).
- Analyzed data sources, prepared data conversion approach to load the data.
- Worked on profiling the data to understand it and to build the mapping document.
- Designed the database to store the data from various data sources.
- Designed the table structures and worked extensively on tuning for better performance.
- Checked the naming standards, data integrity and referential integrity.
- Created a logical design and physical design in Erwin.
- Worked on importing and cleansing high-volume data from various sources such as Oracle and flat files into SQL Server.
- Created VBA macros to convert the Excel input files into the correct format and loaded them into SQL Server.
- Wrote complex SQL stored procedures and functions.
- Created SQL, PL SQL scripts and analyzed the data in MS Access/Excel.
- Developed and published periodic reports from data stored in a SQL database.
- Created macros to automate the process of refreshing the data in MS Excel.
- Analyzed stored procedures and SQL queries to improve query performance and speed up the process.
- Transferred data between flat files, Excel spreadsheets, heterogeneous sources and SQL Server using SSIS package Data Flow and Bulk Insert tasks.
- Scheduled and monitored the ETL packages for periodic loads using SQL Server Agent.
- Executed the SSIS packages from the command line and through macros with DTEXEC.
- Wrote Stored Procedures for executing the SSIS packages remotely.
- Created/Tested the mapping rules according to the requirements in the ETL Packages.
- Implemented SDLC, QA methodologies and concepts in the Project
- Tested the ETL packages comparing the input source files and Target Table data in SQL Server.
- Validated the ETL-transformed data for completeness and correctness and for conformance to the mapping rules.
- Executed Complex SQL queries within UNIX shell scripts in the UNIX environment.
- Wrote UNIX scripts for automating the process.
- Developed Test Summary Reports and participated in GO / NO-GO meetings.
- Completed knowledge transfer to employees enabling them to understand the complete data conversion process.
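The package testing above boils down to a set-comparison between the input file and the loaded table. A small sketch of that check, using a hypothetical employee file and table (the real comparisons ran between flat files/Excel and SQL Server tables loaded by SSIS):

```python
import csv
import io
import sqlite3

# Hypothetical source flat file, inlined for the sketch.
source_file = io.StringIO("emp_id,name\n1,Ann\n2,Raj\n3,Lee\n")
expected = {(int(r["emp_id"]), r["name"]) for r in csv.DictReader(source_file)}

# Stand-in for the ETL target table after the package has run.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (emp_id INTEGER, name TEXT)")
conn.executemany("INSERT INTO employee VALUES (?, ?)", sorted(expected))

# Completeness check: every source row loaded, and nothing extra appeared.
loaded = set(conn.execute("SELECT emp_id, name FROM employee").fetchall())
missing_rows = expected - loaded   # in the file but not loaded
extra_rows = loaded - expected     # loaded but not in the file
```

Non-empty `missing_rows` or `extra_rows` would be raised as defects against the mapping or the package logic.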
Environment: SSIS, SQL SERVER 2005, SQL, PL/SQL, UNIX Shell Scripting, TOAD, MS Excel
Confidential, San Leandro, CA
Data Modeler
Responsibilities:
- Worked with data validation, constraints, record counts, source-to-target row counts, random sampling and error processing.
- Wrote macros in Teradata to reduce channel traffic, simplify execution of frequently used SQL operations and improve performance.
- Identified issues, information and behaviors during the adoption of a proprietary information management system.
- Designed and supported SQL 2008 Reporting services, Integration services and Analysis services.
- Created various reports with different functionalities such as drill-down, charts and crosstabs using Crystal Reports.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using Embarcadero ER Studio
- Involved in testing data mapping and data conversion in a server based data warehouse
- Expertise in data quality, data organization, metadata and data profiling
- Tested several complex reports generated by Cognos, including Dashboards, Summary Reports, Master-Detail, Drill-Down and Scorecards.
- Conceptualized, designed and implemented an enterprise data integration platform based on scalable search.
- Created batch processes using FastLoad, BTEQ, UNIX Shell and Teradata SQL to transfer, clean up and summarize data.
- Created ETL documents detailing the ETL (database navigation and data transformation) logic used by developers as coding specifications.
- Scrubbed data to accurately generate customer pulls; provided output files in various formats based on customer requests.
- Accelerated the rate of adoption of the system, improved the quality of the data being input and generated, and promoted accountability among staff and users.
- Made recommendations on potential functional and technical improvements to planned or existing system components and applications.
- Supported Customer Data Integration projects through Business Intelligence and Data Warehouse system integration
- Involved in Developing Test Plans and Developed Test Cases/Test Strategy with input from the assigned Business Analysts
- Translated written Business Requirements Documents into a Master Test Plan and test cases.
- Tested the reports using Business Objects functionalities like Queries, Slice and Dice, Drill Down, Cross Tab, Master Detail and Formulae etc.
- Verified that data movement occurred correctly between each step (data store) in the ODS and that data mapping performed as defined in the Mapping Specs / Use Cases.
- Prepared and presented solutions for the Enterprise Data Integration (ETL) Platform.
- Analyzed data sources, performed data profiling and data validation, and developed low-level design patterns based on the business and functional requirements.
- Worked with fixing the production issues for multiple projects and supported night ETL jobs/mappings/batches/procedures.
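The "transfer, clean up and summarize" batch step above can be sketched in miniature. In production this ran as FastLoad/BTEQ plus shell scripts against Teradata; the staging table and amounts below are hypothetical, but the clean-then-aggregate SQL has the same shape:

```python
import sqlite3

# Hypothetical staging table with a bad row (NULL amount) to clean up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_txn (acct TEXT, amount REAL);
    INSERT INTO stg_txn VALUES ('A', 10.0), ('A', 15.0), ('B', 7.5), ('B', NULL);
""")

# Clean-up: drop rows with NULL amounts before summarizing.
conn.execute("DELETE FROM stg_txn WHERE amount IS NULL")

# Summarize: per-account row count and total, as a downstream mart would consume.
summary = conn.execute(
    "SELECT acct, COUNT(*), SUM(amount) FROM stg_txn GROUP BY acct ORDER BY acct"
).fetchall()
```

In the Teradata version the cleanup and aggregation statements sat in a BTEQ script invoked from a UNIX shell wrapper, with FastLoad handling the initial staging.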
Environment: Sybase 12.3, Windows XP, Oracle 10g, TOAD, UNIX, XML, XSD, XML Spy 2008, Mercury Quality Center, PVCS (Dimensions), Oracle, SQL Server 2008, SQL Server Integration Services (SSIS), MS Project, Mainframe, MVS, JCL, COBOL II, VSAM, Copy Books, ISPF, OS/390, Control-M, SQL, PL/SQL, XSLT, XQuery, UNIX Shell Scripting, Business Objects XI R2, Crystal Reports, Autosys, MQ Series, Java, JMS
Confidential
Junior Data Analyst
Responsibilities:
- Reviewed the business requirements and worked with business and requirements teams on gaps found during the review.
- Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center 9.0
- Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.
- Involved in Data mapping specifications to create and execute detailed system test plans.
- The data mapping specifies what data will be extracted from SSIMS source, transformed and sent to an external entity.
- Responsible for Initial data migration testing (Initial Load) from different sources to MDM Database.
- Responsible for Periodical data synchronization (Delta Load) between the source systems and the MDM DB for ensuring proper data synchronization.
- Involved in Teradata SQL Development, Unit Testing and Performance Tuning
- Worked with the Mainframe CICS screen environment in creating test data for all the MDM services.
- Performed Independent MDM Service testing by preparing xml transactions.
- Worked with various data sources across multiple relational databases such as DB2 UDB and Oracle to perform data validation.
- Extensively used Oracle PL/SQL Developer to query the Oracle database.
- Tested UNIX shell scripts used in the ETL process to automate loading and pulling of data.
- Tested XMLs as part of the MDM SOA architecture.
- Compared and Tested Source data with XML Output flow.
- Worked on Teradata and DB2 databases using QueryMan and QMF.
- Reviewed DataStage mappings and test cases before delivering to the client.
- Created ETL test data for all ETL graphs based on transformation rules to test the functionality of the application.
- Involved in extensive data validation using SQL queries and back-end testing.
- Reported bugs and tracked defects using Quality Center 8.0 (Test Director)
- Wrote test scripts to compare OLTP vs. OLAP data for ETL testing.
- Worked as a POS tester for Java upgrade QVS uplift and IBM OS installation.
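The Initial Load / Delta Load synchronization tested above is, at its core, an upsert into the MDM store: the initial load inserts everything, and each delta run updates changed records and inserts new ones. A minimal sketch with a hypothetical `mdm_party` table (the real MDM DB sat on DB2/Oracle behind ETL tooling):

```python
import sqlite3

# Hypothetical MDM-style target table keyed on the party identifier.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE mdm_party (party_id INTEGER PRIMARY KEY, name TEXT, updated TEXT)")

def apply_load(cur, records):
    """Upsert each source record; initial and delta loads share the same path."""
    for party_id, name, updated in records:
        # INSERT OR REPLACE: new keys are inserted, existing keys are overwritten.
        cur.execute("INSERT OR REPLACE INTO mdm_party VALUES (?, ?, ?)",
                    (party_id, name, updated))

cur = conn.cursor()
apply_load(cur, [(1, "Acme", "2010-01-01"), (2, "Globex", "2010-01-01")])           # initial load
apply_load(cur, [(2, "Globex Corp", "2010-02-01"), (3, "Initech", "2010-02-01")])   # delta load
rows = cur.execute("SELECT party_id, name FROM mdm_party ORDER BY party_id").fetchall()
```

Delta testing then checks exactly this invariant: unchanged records survive, changed records carry the new values, and new records appear once.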
Environment: SQL, PL/SQL, IBM DataStage, DB2 UDB v8, QMF 8.1, Oracle, IBM WebSphere MQ, PL/SQL Developer, Teradata SQL Assistant 7.2, IBM DB2 v9, Attachmate Extra, PuTTY, HP Quality Center 9.0, CDBS 6.0, firstobject XML 2.4, XML, IBM AIX, UNIX, FileZilla 2.2, WinSCP 4.0, UltraEdit 15.0, SecureFX, SecureCRT.
