Sr. Enterprise Data Warehouse Resume
OBJECTIVE
- To obtain a position as a Sr. Data Warehouse/Data Modeler, Data Analyst, or ETL Developer (Informatica), addressing and solving the company's technical needs by applying training and experience in logical data mapping, databases, Business Intelligence (BI), ETL, and data warehouse modeling and development.
SUMMARY
- 10+ years using Oracle databases and Oracle tools (TOAD, SQL Developer, SQL Navigator)
- 1+ year using Microsoft SQL Server 2005
- 6+ years of ETL design and development using Informatica PowerCenter
- Experience with Informatica PowerCenter v9.1
- Very solid understanding of enterprise data warehouse architecture, data modeling, and ETL design specifications and mappings (Velocity, Inmon, and Kimball methodologies)
- Very strong understanding of ANSI SQL, Table Joins, RDBMS architecture, Query tuning
- Very strong knowledge of Data Modeling, Normalization, Dimensional Modeling, Star schema, Fact and Dimensions
- Excellent communication and client-facing skills with business units and IT groups in interactive design sessions to design the solution.
- Comfortable and confident leading whiteboard sessions with business units and IT groups.
- Self-motivated and self-directed; requires minimal supervision.
- Strong SQL skills and Data Analysis
- Excellent work habits; project-, task-, and deadline-oriented; does what it takes to get the job done and meet timelines.
- Informatica Certified Developer - Power Center, Administration
- Microsoft Certified IT Professional - Business Intelligence and Data Base Administrator
- IBM Certified Solution Expert - Cognos Business Intelligence
TECHNICAL SKILLS
Software: Informatica PowerCenter v9.1 (ETL), Repository Manager, Informatica Data Profiling (IDE), Informatica Data Quality (IDQ), SSAS, SSRS, SSIS, SAP BW/BI 3.5, TOAD 10.6, SQL Developer, UC4, Word, Excel, Crystal Reports, BusinessObjects, IBM ClearCase and ClearQuest, HP Quality Center 10.0 (MQC), ER/Studio, ERwin
Languages: SQL, PL/SQL, SQL*Plus, T-SQL, SQR, Micro Focus COBOL, PowerBuilder, COBOL, RPG, CICS
Databases: Oracle 11g, DB2, Sybase, MS SQL Server 2005-2008, MS Access, IDMS, IMS, VSAM
Operating Systems: Windows 7, XP, Windows 2000, IBM AIX (RS/6000), UNIX
PROFESSIONAL EXPERIENCE
Confidential
Sr. Enterprise Data Warehouse/ Modeler/ETL Developer
Responsibilities:
- Responsible for enterprise-level business intelligence and data warehousing, including the architecture, design, and deployment of scalable, highly available solutions in the data warehouse environment: relational/dimensional data models and schemas in the ERD.
- Gather business drivers and requirements for the enterprise data warehouse and translate them into technical solutions.
- Develop and maintain Extract, Transform, Load (ETL) jobs for the enterprise data warehouse.
- Responsible for maintaining PL/SQL code and Oracle 11g databases, including troubleshooting and performance tuning.
- Responsible for maintenance and development of Oracle ERP Applications and underlying database tables.
- Collaborate with business owners to successfully implement and maintain an enterprise level business analytics and data warehousing solution
- Develop front-end, metadata and ETL specification to support business requirements for reporting across the organization
- Manage and ensure project delivery
- Responsible for the application maintenance of both Production and Non-Production environments
- Troubleshoot Production issues, identifying root causes and planning remediation.
- Develop reporting standards and best practices to ensure data standardization and consistency
- Perform data cleansing and data auditing as necessary to ensure data quality
- Define and develop physical and logical data models to support departmental and functional reporting needs (see the dimensional-model sketch after this list)
- Create, maintain, and manage documentation for all data warehouse and reporting development efforts. Help create and implement a long-term storage and business continuity strategy, including backup and recovery, data storage, and archiving.
- Follow instructions and perform other duties as assigned by supervisor.
- Assist other employees in accomplishing company goals.
- Participate in and complete company-required training programs.
- Participate in Environmental, Health and Safety initiatives as set forth by the company.
- Work with business managers across the organization to define, plan, and develop further BI/reporting capabilities.
- Develop and maintain a long-term roadmap for solutions
- Design, manage and implement major upgrades and expansions.
- Ability to work effectively on multiple priorities.
- Excellent team-player with superior interpersonal skills who can work closely with both technical development teams and business users.
- In depth understanding of data warehouse concepts, methodologies and infrastructure
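A minimal sketch of the kind of dimensional (star schema) model referenced in the data modeling bullet above; all table and column names are illustrative placeholders, not drawn from any client system:

```sql
-- Illustrative star-schema sketch (hypothetical table/column names).
CREATE TABLE dim_customer (
    customer_key   NUMBER        PRIMARY KEY,  -- surrogate key
    customer_id    VARCHAR2(20)  NOT NULL,     -- natural/business key
    customer_name  VARCHAR2(100),
    region         VARCHAR2(50)
);

CREATE TABLE dim_date (
    date_key       NUMBER        PRIMARY KEY,  -- e.g. 20120131
    calendar_date  DATE          NOT NULL,
    fiscal_period  VARCHAR2(10)
);

CREATE TABLE fact_sales (
    date_key       NUMBER NOT NULL REFERENCES dim_date (date_key),
    customer_key   NUMBER NOT NULL REFERENCES dim_customer (customer_key),
    sales_amount   NUMBER(12,2),
    quantity       NUMBER(10),
    CONSTRAINT pk_fact_sales PRIMARY KEY (date_key, customer_key)
);
```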
Confidential
BI Test Lead
Responsibilities:
- Coordinates all aspects of test planning, scenario definition and execution
- Carries out procedures to ensure that ETL, DW, and BI systems and services meet organization standards and business requirements
- Develops and maintains test plans, test requirements documentation, test cases and test scripts
- Develops and maintains test data sets
- Verifies compliance to commitments contained in the test plans
- Works with project management and development teams to resolve issues
- Communicates concerns, issues and problems with data
- Leads testing and post-production verification efforts
- Executes test scripts, documents results, and publishes objective test evidence (see the validation-query sketch after this list)
- Investigates and resolves test failures
- Utilized denormalized and Third Normal Form (3NF) Oracle 10 database structures
- Utilized ERwin data modeling to create the test database
- Followed Kimball best practices for developing the data warehouse.
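A minimal sketch of the kind of validation queries used during ETL/DW test execution, assuming a simple star schema; table and column names are hypothetical:

```sql
-- Illustrative ETL test-validation queries (hypothetical table/column names).
-- 1) Duplicate natural keys loaded into a dimension table.
SELECT customer_id, COUNT(*) AS dup_rows
FROM   dim_customer
GROUP  BY customer_id
HAVING COUNT(*) > 1;

-- 2) Fact rows whose dimension key has no matching dimension row (orphans).
SELECT f.customer_key, COUNT(*) AS orphan_rows
FROM   fact_sales f
LEFT JOIN dim_customer d ON d.customer_key = f.customer_key
WHERE  d.customer_key IS NULL
GROUP  BY f.customer_key;
```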
Confidential
Data Analyst/ETL Developer
Responsibilities:
- Create Logical Data Mappings (LDMs) to analyze, define, design, and develop the staging and cleansing Oracle databases using fact and dimension structures as the database models (i.e., relational and star schema). Create required data conversion process designs and own the data dictionaries.
- Create LDMs to analyze, define, design, and size the Oracle data warehouse used as the ETL target for the Supplement Enhancement Analytics Reporting System (SEARS).
- Create and implement logical data mappings and develop database sizing estimates.
- Create LDMs to define and design the database tables across SEARS, with source data coming from Epsilon and Entegrate (SOLARC Right Angle, SRA), a SunGard software product, plus CSV and flat-file sources used in the ETL.
- Responsible for resolving ETL exception-log errors and warnings related to data from source and target Oracle tables, source qualifiers, transformations, and lookup tables from Epsilon and SRA, and from pre- and post-MATLAB CSV files for the MVaR model being implemented throughout BP IST.
- Work with the Data Architect on definitions of Oracle table columns, attributes, primary keys, indexes, and referential integrity constraints.
- Work with the BusinessObjects developer to define the universe of Oracle fields used to develop the reports required by SEARS primary users.
- Create data conversion scripts, stored procedures and complex SQL.
- Validate business/functional requirements against data model, scripts, SQL, etc. Create and conduct unit testing. Create a data quality strategy.
- Used a star schema database structure with multiple fact and several dimension tables.
- Assist Data Architect in creating data models utilizing ERwin
- Reconcile record counts loaded to the target tables against record counts from the ETL source tables (see the reconciliation-query sketch after this list). Work with Business Analysts to define source and target tables that meet SEARS user data requirements.
- Work with the Primary User Analyst on Market Risk Price Attributes curve SBU rollup changes in SOLARC, building the SQL to report all curve mapping assignment changes.
- Responsible for creating test plans and test sets for all Change Requests (CRs) in preparation for migration, using Mercury Quality Center (MQC), IBM ClearQuest, and Remedy.
- Validate all changes made during migration. Assist with data migration plans.
- Provide query optimization across the project. Act as liaison between the project team and Database Administrators.
- Support User Acceptance testing (UAT)
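A minimal sketch of the source-to-target record count reconciliation mentioned above; the staging and warehouse table names are hypothetical stand-ins for the actual SEARS ETL sources and targets:

```sql
-- Illustrative source-to-target count reconciliation by load date
-- (hypothetical table names).
SELECT s.load_date,
       s.src_count,
       NVL(t.tgt_count, 0)               AS tgt_count,
       s.src_count - NVL(t.tgt_count, 0) AS variance
FROM   (SELECT TRUNC(load_date) AS load_date, COUNT(*) AS src_count
        FROM   stg_trade_detail
        GROUP  BY TRUNC(load_date)) s
LEFT JOIN
       (SELECT TRUNC(load_date) AS load_date, COUNT(*) AS tgt_count
        FROM   dw_trade_fact
        GROUP  BY TRUNC(load_date)) t
  ON   t.load_date = s.load_date
ORDER  BY s.load_date;
```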
Confidential
Data Analyst /ETL Developer
Responsibilities:
- Develop normalized database structures with primary keys (PK) and foreign keys (FK)
- Develop Test data models utilizing ERwin
- Create scripts to create, drop, and alter database tables for the DBAs
- Help the DBAs with the conceptual, logical, and physical database design of the Integration Data Warehouse.
- Follow Kimball best practices for developing the data warehouse.
- Create LDMs to analyze, define, design, and size the Oracle data warehouse (16 tables) used as the ETL target by Data Integration for profiling and cleansing.
- Create LDMs to analyze, define, design, and develop the staging and cleansing Oracle database using a denormalized structure as the database model.
- Create LDMs to define and design the database tables across Lawson, CSV, and flat-file sources used in the ETL.
- Responsible for the ETL of the Lawson Item Master database into Informatica for data cleansing efficiency using the Informatica PowerCenter development tools: Designer (mappings, transformations, source and target definitions); Workflow Manager (create mapping sessions and workflows to organize the overall flow of data, workflow objects, schedules, and physical database connections); Workflow Monitor (view running sessions, runtime statistics, and the history of past workflow runs; stop, abort, resume, or restart jobs).
- Profile tables (IDE) on columns selected for cleansing, analyzing patterns, frequency, and percentage of occurrences (see the profiling-query sketch after this list). Add the referential and source tables used in IDE and IDQ.
- Create mapplet rules used for cleansing (IDQ), utilizing different IDQ functions.
- Create IDQ mappings with mapplets for cleansing data in selected table columns.
- Work with the DBA to define and create target and source Oracle tables used in ETL, IDE, and IDQ.
- Coordinate with Business Unit users on setting up Business rules on columns selected for IDE and IDQ.
- Analyze the different Lawson Item Master Tables for data and referential integrities.
- Responsible for defining, designing, and creating the table that stores cleansed data to be posted back to the Lawson Item Master production database.
- Help assign tasks to team members to focus on Data Integration Group deliverables and expectations.
- Assist in developing documentation of the ETL mappings, data profiling, and data quality mappings for the Data Integration Group project.
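A minimal sketch of the kind of column-profiling query that approximates the IDE frequency/percentage analysis described above; the staging table and column names are hypothetical:

```sql
-- Illustrative column profile: value frequency and percentage of occurrences
-- (hypothetical table/column names).
SELECT uom                                               AS column_value,
       COUNT(*)                                          AS occurrences,
       ROUND(RATIO_TO_REPORT(COUNT(*)) OVER () * 100, 2) AS pct_of_total
FROM   stg_item_master
GROUP  BY uom
ORDER  BY occurrences DESC;
```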
Confidential
Financial Reporting Specialist
Responsibilities:
- Responsible for preparing the data, reports, files, and databases needed for month-end and calendar close. Prepare data mapping and migration from 3rd-party tape to the Servicing Valuation Model.
- Create data valuation files for the 3rd party for servicing valuations and reporting on Purchase Mortgage Servicing Rights (PMSR). Design, test, and automate new reports as required.
- Production maintenance of existing Access databases.
- Prepare monthly updates to Cognos Finance and reconcile data to Financials.
- Export data monthly from AS400 to Access database. Prepare monthly detailed job cost to general ledger reconciliation.
- Monitor Task Schedulers to verify job success. Create and reconcile monthly corporate advance reports.
- Update Risk Assessment Default Analytic Reporting System (RADAR) tables via Accounting Load (previous servicer advances, arrearage, Loan Servicing Accounting Management System (LSAMS) recoveries, and settlements)
- Prepare and analyze the Non-Paid-In-Full Trailing Expenses report
- Prepare quarterly Foreclosure Reserve Analysis
- Design and automate other various reports as required
Confidential
Sr. Data Cleansing Analyst / ETL Developer
Responsibilities:
- Identify data quality problems in the AS400 legacy system, such as duplicate data with wrong city names, states, and ZIP codes, using the official US Postal Service addresses FTP'd to the staging Oracle database (see the address-matching query sketch after this list).
- Provided business analysts and data stewards with samples and reports on data needing cleansing in the AS400 legacy system.
- Address data issues discovered during data analysis, such as wrong customer types, wrong pickup dates, customers never billed because of wrong addresses, wrong trash bins, and pickups recorded against the billing address rather than the physical trash pickup address.
- Set up meetings with Field resources to address data issues, concerns and deadlines to meet the SAP market area implementation.
- Based on business requirements fielded by Field resources and other project team members, develop additional data cleansing reports and queries, or modify existing ones, in Access.
- Interface with Field resources to resolve cleansing questions, testing reconciliation, quality and progress with data cleansing and issues concerning the project.
- Develop and unit test SQL procedures on existing Access tables and generate reports given to the Field resources with cleansing responsibilities.
- Create LDMs to analyze, define, design, and size the Oracle data warehouse (26 tables) used as the ETL target by SAP for conversion mapping.
- Create LDMs to define and design the Oracle staging tables using a denormalized structure as the database model for SAP conversion and for the reports fielded to the different Market Directors for data cleansing.
- Create LDMs to define and design the database, CSV, and flat-file sources used in the ETL.
- Develop and populate Access database tables via ODBC from the staging Oracle tables used in data cleansing for all Market and National Accounts areas.
- Convert all Access database SQL to Informatica for data cleansing efficiency using the Informatica PowerCenter development tools: Designer (mappings, transformations, source and target definitions); Workflow Manager (create mapping sessions and workflows to organize the overall flow of data, workflow objects, schedules, and physical database connections); Workflow Monitor (view running sessions, runtime statistics, and the history of past workflow runs; stop, abort, resume, or restart jobs).
- Assist current data cleansing resources in writing reports and running queries used to cleanse all Market Areas and National Accounts.
- Develop, support, and maintain the current daily production workflows assigned to data cleansing.
- Develop and maintain source and target Oracle table definitions when designing new Informatica mappings.
- Provide direction and answers to Business Unit concerns regarding data cleansing.
- Create progress summary reports submitted to the Data Cleansing manager regarding market-area cleansing.
- Provide Oracle table indexing recommendations to the DBA needed to execute efficient SQL queries.
- Create an Excel spreadsheet to map data cleansing Oracle table columns to the corresponding table columns in SAP.
- Attend meetings with the SAP data mapping group to address new changes, requirements, and progress on SAP development.
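A minimal sketch of the address-matching query behind the cleansing reports described above, assuming the USPS reference addresses were loaded into a staging table; all table and column names are hypothetical:

```sql
-- Illustrative address-validation query: flag customer rows whose city/state
-- does not agree with the USPS reference data for the same ZIP code
-- (hypothetical table/column names).
SELECT c.customer_id,
       c.city  AS legacy_city,
       c.state AS legacy_state,
       c.zip   AS legacy_zip,
       r.city  AS usps_city,
       r.state AS usps_state
FROM   stg_customer_address c
LEFT JOIN ref_usps_address r
       ON r.zip = c.zip
WHERE  r.zip IS NULL                        -- ZIP not found in the USPS reference
   OR  UPPER(c.city)  <> UPPER(r.city)      -- city disagrees with USPS city for that ZIP
   OR  UPPER(c.state) <> UPPER(r.state);    -- state disagrees with USPS state
```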
Confidential
Senior IT Analyst
Responsibilities:
- Developed Oracle, SQL Server 2000, and DB2 relational database designs used as back ends for different web applications.
- Define, design, size, and develop the Oracle Marketing Information System (MIS) data warehouse (45 tables) to combine both residential and small-business customers. This data warehouse is used as data marts for other business units and as the back end for intranet and internet web sites.
- Define, design, and develop the Marketing Information System (MIS) as a normalized database model used as a data repository.
- Developed PL/SQL stored procedures, triggers for database updates, and views for queries and reports (see the PL/SQL sketch after this list).
- Used AIX to execute PL/SQL procedures and other table queries against the Oracle database, and provided database development support for interfaces to external sources using XML.
- Developed, implemented, documented, and maintained batch update programs for enterprise-size database applications.
- Identify data quality problems in the mainframe legacy system, such as misspelled city names, states, and ZIP codes, using the official US Postal Service addresses for both CIS residential and BES small-business customers.
- Provided mainframe application developers and Business Unit data owners with reports and sample data, via an MS Access database, on data needing cleansing for both CIS residential and BES small-business customers.
- Address data issues discovered during data analysis, such as wrong move-in and move-out dates, credit ratings, deposit requirement flags, meter reading flags, and meter types.
- Develop an Excel spreadsheet summarizing data cleansing progress, and set up meetings with CIS and BES customer data owners to discuss cleansing status, issues, concerns, and schedule.
- Analyze and help develop the Oracle tables that combine both sets of legacy data in an Oracle data warehouse.
- Maintain the integrity, accuracy, and accessibility of customer data used in different company intranet and internet web sites (e.g., 15-minute interval usage, low and high peak usage, load profile, and rate code) by validating and comparing data from the Oracle database against the legacy data.
- Assist Data Architect in creating data models utilizing ERwin
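A minimal sketch of the kind of PL/SQL update procedure and trigger referenced above; the MIS table and column names are hypothetical:

```sql
-- Illustrative PL/SQL update procedure and audit trigger
-- (hypothetical table/column names).
CREATE OR REPLACE PROCEDURE update_customer_rate (
    p_customer_id IN VARCHAR2,
    p_rate_code   IN VARCHAR2
) AS
BEGIN
    UPDATE mis_customer
    SET    rate_code   = p_rate_code
    WHERE  customer_id = p_customer_id;

    IF SQL%ROWCOUNT = 0 THEN
        RAISE_APPLICATION_ERROR(-20001, 'Customer not found: ' || p_customer_id);
    END IF;
END update_customer_rate;
/

CREATE OR REPLACE TRIGGER trg_mis_customer_audit
BEFORE UPDATE ON mis_customer
FOR EACH ROW
BEGIN
    :NEW.last_updated := SYSDATE;   -- stamp every update with the change date
END;
/
```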
Environment: Microsoft products (Word, Excel, Access), SQL*Plus, PowerBuilder, TOAD, Quest Reporter, Oracle, PL/SQL, stored procedures, triggers, views, SQL packages, Micro Focus COBOL, SQR, COBOL, AIX, Windows 2000.