Sr. Database Developer/Administrator Resume
Dallas, TX
SUMMARY
- Over 11 years of experience in multiple roles such as Data Analyst, Data Warehouse Architect, and Programmer/BI Analyst. Well versed in a range of technologies, including Data Warehousing and mining applications, data warehouse conversion and integration projects, Artificial Intelligence and Neural Networks, Client/Server Applications (OLTP), Service-Oriented frameworks, database systems, Enterprise Integration, and Object-Oriented Application Development (OOAD).
- Strong analytical and problem-solving skills, with in-depth technical knowledge; effectively apply business solutions to improve productivity through reuse-centric design.
- Extensive experience in Netezza (MPP) and Oracle PL/SQL.
- Worked with Dimensional Data Warehouses in Star and Snowflake Schemas; created Slowly Changing Dimension (SCD) Type I/II/III mappings.
- Strong ability to understand and maintain complex database schema objects (View and Stored procedures).
- Strong background in performance tuning (SQL, PL/SQL) using Oracle Explain Plan, SQL Trace, and Oracle Hints.
- Experience with a variety of BI tools, including Tableau, Cognos, and DataStage.
- Experience with data related technologies including Hadoop, Spark, and R.
- Experience with data mapping, data profiling, and data quality.
- Experience with data integration issues (validation and cleaning) and familiarity with complex data structures.
- Experience with Enterprise Scheduling tools such as Tivoli Scheduler.
- Infrastructure background working within the cloud space (Microsoft Azure).
- Working experience with Python, R, SQL, and PowerShell.
- Experience with release/change management.
- Experience with Waterfall/Iterative/Scrum/Agile development methodologies.
- Expertise in writing efficient advanced and analytical SQL.
- Experience delivering and creating strategic roadmap for reporting and analytics.
- Experience in project life cycle activities on DW/BI (Data warehouse and Business Intelligence) development and maintenance projects.
- Strong knowledge of ETL concepts (ETL Design, Development & Build) especially as applied to DataStage and SSIS.
- Strong knowledge of data analysis, data transformation, conceptual data modeling, and metadata management.
- Implemented industry BI standards and best practices.
- Experienced in Logical and Physical data modeling of staging and warehouse environments using Data modeling tools like ERWin, Relational Rose, and Oracle Designer.
- Highly experienced in preparing Project Estimates and Project Plans.
- Experience in interacting with Business Users to analyze Business Process requirements and transform them into documented, designed, and delivered solutions.
- Exposure to multiple industry domains such as Finance and HealthCare.
- A Self-starter with strong verbal and written communication skills, a positive attitude, willingness to learn new concepts and acceptance of challenges.
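The SCD Type I/II/III work noted above lends itself to a short illustration. Below is a minimal sketch of the Type II pattern in Python, assuming a hypothetical customer dimension with `cust_id`, `city`, effective dates, and a current-row flag (all names are illustrative, not from a specific project):

```python
from datetime import date

def apply_scd2(dim_rows, incoming, load_date):
    """Apply a Slowly Changing Dimension Type II change.

    dim_rows: list of dicts with keys 'cust_id', 'city',
              'eff_from', 'eff_to', 'is_current' (hypothetical columns).
    incoming: dict with 'cust_id' and 'city' from the source feed.
    """
    for row in dim_rows:
        if row["cust_id"] == incoming["cust_id"] and row["is_current"]:
            if row["city"] == incoming["city"]:
                return dim_rows  # attribute unchanged; nothing to version
            # expire the current version as of the load date
            row["eff_to"] = load_date
            row["is_current"] = False
    # insert the new version (or the first version for a new key)
    dim_rows.append({
        "cust_id": incoming["cust_id"],
        "city": incoming["city"],
        "eff_from": load_date,
        "eff_to": None,
        "is_current": True,
    })
    return dim_rows

# usage: a customer moves, producing a second, current row
dim = [{"cust_id": 1, "city": "Dallas",
        "eff_from": date(2010, 1, 1), "eff_to": None, "is_current": True}]
dim = apply_scd2(dim, {"cust_id": 1, "city": "Plano"}, date(2012, 6, 1))
```

In a warehouse this is usually expressed as a MERGE/UPDATE-plus-INSERT in the ETL layer; the Python form above only illustrates the row-versioning logic.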
TECHNICAL SKILLS
Operating Systems: Windows 95, Windows NT 4, Windows XP, UNIX, Windows 10
Data Warehouse Tools: Hyperion Planning, HFM, Essbase, DataStage, Cognos, MicroStrategy, Tableau, SSIS, SSRS, SSAS, Xcelsius, Business Objects, ODI
Data Modelling Tools: ERwin, Rational Rose, Oracle Designer, IBM InfoSphere Data Architect
Database Mgmt. System: Oracle 8i/9i/10g, Sybase 11/12.5, SQL Server 2005, DB2, Netezza
DB Tools: TOAD 7.1.7, Smart Term 9.0, MS Access, SQL Developer
Languages: C, C++ (OOP), HTML, PL/SQL
Editors: Vi, PL/SQL Editor, Toad, Aginity
Quality: ITIL Foundation
Defect Management Tools: Remedy, HP Quality Center, Test Director and JIRA.
PROFESSIONAL EXPERIENCE
Confidential, Dallas, TX
Sr. Database Developer/Administrator
Responsibilities:
- Participate in requirement gathering meetings with clients and define the scope and visibility options for new projects.
- Prepare project review documents, test plans and release document changes.
- Perform data profiling and cleansing, and resolve data quality issues.
- Extract data from legacy AS400 sources and prepare flat files.
- Develop the ETL programs (using nzload, nzsql) in Netezza to load the data periodically into Data Warehouse.
- Develop requirements traceability matrices linking business and technical requirements to test plans/test cases to ensure full coverage in quality assurance (testing) processes.
- Monitor the long running queries and tune for better performance.
- Complete projects including warranty, postal walk, reporting cube, model score current, site opt changes, and Department of Health Services suppression modification.
- Support Netezza NPS with complete responsibility of administration including backup, tuning, maintaining, creating objects with proper distribution, managing SPUs, SPAs and statistics.
- Manage the physical and logical data structures and maintain the metadata.
- Write Shell scripts to automate the load processes that run periodically to load the data.
- Write SQL scripts, functions, and shell scripts to load the data from flat files.
- Develop data flow diagrams and present to the Business Owners and stakeholders.
- Track the requirement changes and estimate the development, testing and implementation efforts.
- Write and execute SQL statements to validate the data against the source.
- Prepare report mockups, consolidate the list of data elements and metrics in MS Excel, and generate charts to present to the business.
- Design and develop Tableau/MicroStrategy reports and dashboards.
- Perform impact analysis and validation of existing production reports and universes after application and database upgrade.
Environment: Netezza 4.5.2, Oracle 10g, SQL, UNIX, IBM DataStage, Cognos, Tableau, IBM InfoSphere Data Architect
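The flat-file preparation and validation steps in this role can be sketched as a small pre-load check of the kind run before handing an extract to nzload. The file layout, delimiter, and column count below are hypothetical:

```python
import csv
import io

def validate_flat_file(text, expected_cols, delimiter="|"):
    """Split a delimited extract into loadable rows and rejects.

    Rows are rejected when the column count is wrong or the key
    (first field) is blank; rejects carry their 1-based line number
    so they can be reviewed before the load runs.
    """
    good, rejects = [], []
    reader = csv.reader(io.StringIO(text), delimiter=delimiter)
    for lineno, row in enumerate(reader, 1):
        if len(row) == expected_cols and row[0].strip():
            good.append(row)
        else:
            rejects.append((lineno, row))
    return good, rejects

# usage: line 2 is short a column, line 3 has a blank key
sample = "1001|SMITH|TX\n1002|JONES\n|DOE|CA\n"
good, rejects = validate_flat_file(sample, expected_cols=3)
```

A real pipeline would also reconcile the accepted row count against the source system before invoking nzload, per the validation bullet above.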
Confidential, Lake Forest, CA
Business Intelligence/DW Architect
Responsibilities:
- Set up meetings to discuss the scope and visibility options for new critical audits.
- Understand business requirements from Business Managers.
- Gather business requirements with various Business Functional Managers/Analysts to understand the meaning and usage of data across the enterprise.
- Understand and identify master data across enterprise applications to devise strategies for master data management with data cleansing methodologies, and recommend the importance of data stewardship.
- Define data naming and metadata for entities and data elements in the data model.
- Create data mapping for source and target databases.
- Create data quality strategies for enterprise data, including data profiling and data cleansing.
- Initiate and document data naming standards as a part of data governance at the enterprise.
- Determine standard data naming rules, logical and physical metadata for the data elements.
- Approve ETL design for data movement from the source to target.
- Schedule the workflows and batches to load the data periodically, i.e., daily, weekly and monthly.
- Develop complex transformations using aggregator, router, lookup, sorter and joiner transformations.
- Perform data profiling and cleansing, and resolve data quality issues.
- Manage the physical and logical data structures and maintain the metadata.
- Write PL/SQL procedures, functions, and shell scripts to load the data from flat files.
- Generate explain plans, tune procedures and functions, and develop SQL*Loader scripts for data loads.
Environment: Oracle, DataStage, DB2, Embarcadero, I-series/AS400, MicroStrategy, SSIS, SSRS
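The data profiling and quality strategies described in this role can be illustrated with a minimal column profile; the input rows and column names here are hypothetical:

```python
def profile_columns(rows, columns):
    """Column-level profile: null count and distinct count per column.

    rows: sequence of tuples aligned with `columns`.
    Empty strings and None are both treated as nulls, a common
    convention when profiling flat-file extracts.
    """
    stats = {}
    for i, col in enumerate(columns):
        values = [r[i] for r in rows]
        non_null = [v for v in values if v not in (None, "")]
        stats[col] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return stats

# usage: one null city, one repeated city value
rows = [("A", "Dallas"), ("B", None), ("C", "Dallas")]
stats = profile_columns(rows, ["cust", "city"])
```

Profiles like this feed the cleansing rules and data steward reviews mentioned above, flagging columns with unexpected null rates or cardinalities.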
Confidential
Business Intelligence Analyst
Responsibilities:
- Install Hyperion System 9 products in Windows 2003.
- Collaborate in data modeling and database design to convert all existing MARS data into Oracle using Erwin and PL/SQL utilities.
- Prepare methodologies and the System Requirement Specifications document.
- Set up application and configure with Data Source and Shared services.
- As an Essbase Database Administrator, created Users and Groups in Hyperion Essbase and assigned privileges accordingly; applied security on different applications by creating filters on their dimension members.
- Strong knowledge of the overall architecture of Hyperion Interactive Reporting.
- Build a Sales Dashboard solution using Hyperion Interactive Reporting and Dashboard.
Environment: Oracle Hyperion EPM Suite, Hyperion Planning, Essbase, MDX query, Hyperion Interactive Reporting, XCelsius Dashboards
Confidential, Mount Laurel, NJ
Data Analyst
Responsibilities:
- Develop ETL programs to load the data from Legacy Systems into Oracle database using Oracle PL/SQL, SQL Loader, merge, external tables and other utilities.
- Create detailed design document and accommodate changes to improve the existing database.
- Maintain separate user accounts for the user community in Business Objects (BO).
- Identify performance issues and monitor the data loads.
- Review the entire existing system, identify and eliminate unnecessary feeds from the DW, and save disk space on the production server.
- Reduce coding redundancy by implementing a generic package to load small dimension tables.
- Review six ETL approaches, including external tables, SQL*Loader, ACTA, ETL procedures, non-ETL procedures, and generic load processes.
- Follow the Change Management Process (CMP) and receive approvals from the Tech Lead and Functional Leads.
- Provide database administration and performance tuning support, gathering statistics on a regular basis.
- Make changes to Star Schema data model to accommodate the new changes.
- Conduct reverse engineering to extract existing database model to find out proper indexing methods.
Environment: Oracle 9i, UNIX, CaliberRM, TIBCO, Tivoli Scheduler, Hyperion Interactive Reporting, Brio Query, ERwin, SQL, PL/SQL, Informatica PowerCenter
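The SQL*Loader-based loads described in this role are typically driven by a small control file. A minimal sketch, with hypothetical table, column, and file names:

```
-- load_customers.ctl: hypothetical staging load from a comma-delimited extract
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  cust_id,
  cust_name,
  city,
  load_dt  SYSDATE
)
```

Such a file would be invoked along the lines of `sqlldr userid=<user> control=load_customers.ctl log=load_customers.log`; all names here are illustrative.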