
Datawarehouse Lead Resume

North Andover, MA

SUMMARY:

  • DW/BI professional with seven years of IT experience in Data Warehousing as a Data Analyst and Data Warehouse Developer
  • Experience leading an off-shore team as ETL lead for an Informatica Cloud Services implementation
  • Core technical skill set includes SQL, PL/SQL, PL/pgSQL programming and performance tuning
  • Good command of SQL queries, complex SQL, SQL*Plus, PL/SQL packages, stored procedures, functions, performance analysis, indexing, creating partitions, aggregate tables and other complex tasks
  • Experience with databases including Oracle 10g/11g, Teradata, Greenplum, Netezza, SQL Server and Postgres
  • Involved in various projects related to System/Data Analysis, Data Profiling, Design and Development of key structures and Data Modeling of both OLAP and OLTP Data warehousing environments
  • Involved in multiple end to end Data Warehouse implementations
  • Experience in dimensional modeling and OLAP/OLTP models
  • Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema, Snowflake Schema, Fact and Dimension tables
  • Good knowledge of implementing data models using data warehousing concepts
  • Good knowledge of creating Conceptual, Logical and Physical models
  • Implemented Slowly Changing Dimensions - Type I & II in Dimensional tables
  • Experience in the work environment consisting of Business analysts, Production Support teams, Subject Matter Experts, Database Administrators and ETL/Database developers
  • Good understanding of Views, Synonyms, Indexes, Joins and Partitioning
  • Involved in Administration, Development, Migration, Major/Minor Upgrade, Database Patching, Backup/Restore, Maintenance activities
  • Experience with the Onsite-Offshore delivery model
  • In-depth knowledge of data mapping, warehousing tools, unit testing, process documentation and the software development life cycle
  • Provided 24x7 production support, including handling of L1, L2 and L3 requests/incidents
  • Outstanding leadership abilities, able to coordinate and direct all stages of software development while managing, motivating and leading project teams
  • Excellent problem-solving skills with a strong technical background and good interpersonal skills. Quick learner and excellent team player, able to meet deadlines and work under pressure
  • Documented naming standards, best practices, run books, etc.
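The Slowly Changing Dimension Type 2 pattern mentioned above can be sketched as follows. This is a minimal illustration, not code from any actual project; the table and column names (`customer_dim`, `customer_sk`, etc.) are hypothetical, and SQLite stands in for the production databases:

```python
import sqlite3

# Minimal SCD Type 2 sketch: a change to a tracked attribute expires the
# current dimension row and inserts a new versioned row.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer_dim (
        customer_sk INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        customer_id TEXT NOT NULL,                      -- natural key
        city        TEXT,
        eff_from    TEXT NOT NULL,
        eff_to      TEXT,                               -- NULL = open-ended
        is_current  INTEGER NOT NULL DEFAULT 1
    )""")

def apply_scd2(conn, customer_id, city, load_date):
    """Expire the current row if the tracked attribute changed, then insert."""
    row = conn.execute(
        "SELECT customer_sk, city FROM customer_dim "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)).fetchone()
    if row and row[1] == city:
        return  # no change: nothing to do
    if row:
        conn.execute(
            "UPDATE customer_dim SET eff_to = ?, is_current = 0 "
            "WHERE customer_sk = ?", (load_date, row[0]))
    conn.execute(
        "INSERT INTO customer_dim (customer_id, city, eff_from) VALUES (?, ?, ?)",
        (customer_id, city, load_date))

apply_scd2(conn, "C1", "Boston", "2020-01-01")
apply_scd2(conn, "C1", "North Andover", "2020-06-01")
rows = conn.execute(
    "SELECT city, is_current FROM customer_dim ORDER BY customer_sk").fetchall()
# Two versions of C1 survive: the expired Boston row and the current one.
```

A Type 1 change would instead simply `UPDATE` the attribute in place, discarding history.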

TECHNICAL SKILLS:

Programming Languages: C, C++, SQL, PLSQL, PL/PG SQL, UNIX

Databases: Oracle 9i/10g/11g, Teradata 16.10, MS SQL Server 2016/2012/2008 R2/2008/2005, Greenplum 4.2/4.3, Netezza 7, Postgres 8.2/8.3, Azure SQL Data Warehouse

Applications: Erwin Data Modeler R9.64, pgAdmin 3, Aginity, Sybase Power Designer, Toad, SQL/PLSQL Developer, Informatica, Informatica Cloud Services, SAP BO, Power BI, Tableau

Domains: Telecom, Banking, Insurance, Manufacturing

PROFESSIONAL EXPERIENCE:

Confidential, North Andover, MA

Datawarehouse Lead

Responsibilities:

  • Apply best practices of data warehouse ETL design, scaling the implementation to suit an organization of Confidential's size
  • Design and develop ETL work packages on Informatica Cloud Services and migrate them to Azure SQL Data Warehouse
  • Guide the off-shore team with functional ETL specifications, implement business logic and schema design where appropriate, and manage the team's time effectively with a methodical and structured approach to design and development
  • Translate functional specifications into ETL design specifications supported by detailed technical specifications for the support of the Data Warehouse development on an ongoing basis
  • Manage delivery of various ETL and Data Warehouse requirements within a small, multi-disciplinary team
  • Design and develop ETL processes for the Confidential data warehouse: acquiring and staging data, storing records within the warehouse's historical archive, and transforming records for loading into the Azure SQL warehouse using Informatica Cloud Services
  • Prepare key documentation to support the technical design in technical specifications
  • Lead ETL development and design while mentoring the project's other ETL developer in best practices and design

Confidential, Saint Paul, MN

Data Analyst and Data Modeler

Responsibilities:

  • Solely responsible for delivering SME input to leadership on choosing appropriate tools and technologies to meet future requirements
  • Interact with the client for requirements gathering and develop new views in Teradata for BI reports for GPIM (Global Product Information Management) users.
  • Responsible for data analysis from source to target for new development
  • Developed complex SQL queries and generic stored procedures to automatically trigger process-status mail
  • Participate in data analysis, data dictionary and metadata management; collaborate with Business Analysts, SMEs, ETL Developers, Data Quality Analysts and Database Administrators on the design and implementation of the Logical Data Model.
  • Design Fact Tables and Dimension Tables for Data Mart to support all the business requirements using Erwin Data Modeler tool.
  • Assisted in generating surrogate keys for the Dimensions referenced in the Fact Table, enabling indexed, faster data access.
  • Manage the Logical Data Model (LDM) and Physical Data Model, and assist the DBA in creating the physical database by providing the DDL scripts, including Indexes, Data Quality check scripts and Base View scripts, while adhering to database optimization standards.
  • Responsible for providing stable environment of data for GPIM users all around the globe who use reports for gap analysis and BI Analytics.
  • Primary contact for entire PISA (Product Information Subject Area) and GPIM team and coordinates with other teams for issues thereby ensuring smooth functionality.
  • Interact with vendors on issues/requests raised by business users and get them clarified.
  • Collect the information about different Entities and Attributes from source system.
  • Maintain and create data models in Erwin data modeling tool.
  • Utilize data modeling phase methodology leveraging conceptual, logical, and physical development to arrive at model solutions.
  • Convert ERD (Entity-Relationship Diagram) Model into Relational Model for data warehousing.
  • Perform Teradata (Enterprise Data Warehouse) validation and analysis to identify the accurate data source and to track data quality to detect/correct the invalid data.
  • Analyze the performance of existing jobs and guide developers on tuning them.
  • Develop reusable database code to automate several database processes, including automation of status mail to reduce manual effort.
  • Conduct deliverable reviews with the client; understand client requirements, define cross-validation and business rules, and finalize the overall requirements-gathering strategy, methodologies and techniques.

Environment: Erwin Data Modeler R9.64, Teradata 15.10.06.02, Oracle 11g, SQL Server 2012/11.0.6567, Hybris, SAP BO/BI reporting

Confidential

Data modeler & Administrator

Responsibilities:

  • Interact directly with business end users to analyze business needs/requirements, and participate in requirements gathering and project estimation.
  • Solely responsible for delivering SME input to leadership on choosing appropriate tools and technologies to meet future requirements.
  • Collect the information about different Entities and Attributes from source system.
  • Maintain and create data models in Erwin data modeling tool.
  • Utilize data modeling phase methodology leveraging conceptual, logical, and physical development to arrive at model solutions.
  • Review /create architecture documents and diagrams: Data models, Data mapping, Data dictionaries, Metadata. Documented various requirements based on the changes that occur day to day.
  • Participate in data analysis, data dictionary and metadata management; collaborate with Business Analysts, SMEs, ETL Developers, Data Quality Analysts and Database Administrators on the design and implementation of the Logical Data Model.
  • Developed the Logical Data Model (LDM) & Physical Data Model and assisted the DBA in creating the physical database by providing the DDL scripts, including Indexes, Data Quality check scripts and Base View scripts, while adhering to database optimization standards.
  • Designed Fact Tables and Dimension Tables for Data Mart to support all the business requirements using Erwin Data Modeler tool.
  • Assisted in generating surrogate keys for the Dimensions referenced in the Fact Table, enabling indexed, faster data access.
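Surrogate key generation during a fact-table load can be sketched as follows. This is an illustrative example only, with hypothetical table names (`product_dim`, `sales_fact`) and SQLite standing in for the production warehouse: each natural key is resolved to (or assigned) a surrogate key before the fact row is written.

```python
import sqlite3

# Sketch: resolving natural keys to dimension surrogate keys during a
# fact-table load. Table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE product_dim (
        product_sk INTEGER PRIMARY KEY AUTOINCREMENT,   -- surrogate key
        product_id TEXT UNIQUE NOT NULL);               -- natural key
    CREATE TABLE sales_fact (
        product_sk INTEGER NOT NULL REFERENCES product_dim(product_sk),
        amount     REAL NOT NULL);
""")

def surrogate_key(conn, product_id):
    """Return the surrogate key for a natural key, inserting a new
    dimension row if the product has not been seen before."""
    row = conn.execute(
        "SELECT product_sk FROM product_dim WHERE product_id = ?",
        (product_id,)).fetchone()
    if row:
        return row[0]
    cur = conn.execute(
        "INSERT INTO product_dim (product_id) VALUES (?)", (product_id,))
    return cur.lastrowid

# Load three fact rows covering two distinct products.
for product_id, amount in [("P-100", 19.99), ("P-200", 5.00), ("P-100", 7.50)]:
    sk = surrogate_key(conn, product_id)
    conn.execute("INSERT INTO sales_fact (product_sk, amount) VALUES (?, ?)",
                 (sk, amount))

dim_count = conn.execute("SELECT COUNT(*) FROM product_dim").fetchone()[0]
fact_count = conn.execute("SELECT COUNT(*) FROM sales_fact").fetchone()[0]
```

Joining facts on a compact integer surrogate key, rather than the natural business key, is what keeps dimension lookups indexed and fast.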

Environment: Erwin Data Modeler R9.64, Aginity, PL SQL Developer, Greenplum 4.3, Oracle 10g

Confidential

Data Analyst and Data Warehouse Developer

Responsibilities:

  • Defining database naming standards and documentation.
  • Finalize appropriate data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Ensure the ease of accessing history of reference data changes by implementing Slowly Changing Dimensions Type 1 & Type 2 of reference data.
  • Design and create the data model in Greenplum; create and run the DDLs
  • Implement SCD logic on tables as per data analysis
  • Develop PL/SQL and PL/pgSQL functions and procedures
  • Developed a star schema data warehouse design for the proposed central model.
  • Perform Oracle/Greenplum (Enterprise Data Warehouse) validation and analysis to identify the accurate data source and to track data quality to detect/correct the invalid data.
  • Creation of DB objects as per standard and providing required access.
  • Implementing Database releases and upgrades
  • Managing database migration request and backups/logs
  • Create and maintain database restoration processes and conduct restoration procedures when needed
  • Develop and update the data technology architecture
  • Developed various automated functions and scripts that sped up regular processes, such as the database segment status trigger, DDL generation, data migration and granting permissions
  • Greenplum database development included writing generalized functions for Slowly Changing Dimensions Type 1 & 2, rapidly changing dimensions, audit balancing, data quality checks and other track-specific needs.
  • Oracle database development included a generic script for DDL and data migration from Oracle to Greenplum and vice versa.

Environment: Sybase Power Designer, Greenplum PG Admin 3, UNIX, Informatica 9.5.1, SAP BO, Power BI, SDL, Greenplum 4.3 (Postgres 8.2), Oracle 10g

Confidential

Data Analyst/Data Warehouse Developer

Responsibilities:

  • Involved in gathering user requirements along with the Business Analyst.
  • Understood the company standards and developed the data models accordingly
  • Developed SQL queries to verify that data has been moved from transactional system to DSS.
  • Worked on data warehouse, data mart reporting system in accordance with requirement.
  • Created DB objects as per the standards and provided proper access.
  • Developed UNIX scripts for implementing batch wise file movement process.
  • Monitored the database performance impact of newly developed code and troubleshot any issues.
  • Migrated data/code across environments (DEV/QA/PRODUCTION)
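The batch-wise file movement mentioned above can be sketched as follows. The original work used UNIX shell scripts; this Python version is an illustrative stand-in, and the directory layout, file names and batch size are all assumptions:

```python
import shutil
import tempfile
from pathlib import Path

# Sketch of batch-wise file movement (the original implementation used
# UNIX shell scripts). Directory names and batch size are assumptions.
def move_in_batches(src_dir, dst_dir, batch_size):
    """Move files from src_dir to dst_dir in fixed-size batches,
    returning the list of batches processed."""
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    files = sorted(p for p in src.iterdir() if p.is_file())
    batches = []
    for i in range(0, len(files), batch_size):
        batch = files[i:i + batch_size]
        for f in batch:
            shutil.move(str(f), str(dst / f.name))
        batches.append([f.name for f in batch])
    return batches

# Demo with throwaway temp directories and five dummy extract files.
src = Path(tempfile.mkdtemp())
dst = Path(tempfile.mkdtemp()) / "landing"
for n in range(5):
    (src / f"extract_{n}.dat").write_text("row\n")
batches = move_in_batches(src, dst, batch_size=2)
# Five files move in batches of 2, 2 and 1.
```

Processing files in fixed-size batches keeps each downstream load run bounded and restartable if a batch fails midway.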

Environment: Greenplum-PG Admin 3, Postgres 8.3, Oracle 10g-Toad, UNIX, SAP BO, Power BI, DataStage

Confidential

Data Analyst/Data Warehouse Developer

Responsibilities:

  • Discuss with business end users to gather more information related to business needs.
  • Analyze the database information available in the current system and identify what needs to be retrieved from other sources.
  • Finalize the new structure required to implement the correct solution per the business requirement.
  • Develop the SQL code and stored procedures for achieving correct output.
  • Creation of DB objects as per the standards and providing proper access.
  • Writing UNIX scripts for implementing batch wise file movement process.
  • Monitoring the database performance impact of developed code and troubleshooting any issues.
  • Migration of code across all environments (DEV/QA/PRODUCTION)

Environment: Greenplum-PG Admin 3, Postgres 8.2/8.3, Oracle 10g, UNIX, SAP BO, Power BI, DataStage

Confidential

Data Analyst/Data Warehouse Developer

Responsibilities:

  • Interact with business users to analyze the actual need behind each requirement.
  • Develop the data model for the requirement with guidance from a senior data modeler
  • Develop the DB model so that the requirement is fulfilled, with additional checks and information.
  • Develop SQL, implement the model physically in the system, and perform quality testing of the output results.
  • Tune the performance of developed code and test it.
  • Analyze data and provide data dumps to business users for ad-hoc requirements.

Environment: Greenplum-PG Admin 3, Oracle 10g, UNIX, SAP BO, Power BI, DataStage

Confidential

Data Analyst/Data Warehouse Developer

Responsibilities:

  • Analyzing data for business requirements and writing SQL scripts to achieve correct results.
  • Developing SQL code and stored procedures so that requirements are fulfilled, with additional checks and information.
  • Analyzing the performance structure of the developed model and testing it in the DEV environment.
  • Migrating SQL code and data from one environment to another as needed.
  • Providing data for ad-hoc requests from business users.

Environment: Greenplum-PG Admin 3, Oracle 10g, UNIX, SAP BO, Power BI, DataStage
