
Data Integration Architect Resume

Knoxville, TN

SUMMARY

12+ years of extensive experience in the design, development, and implementation of data warehouses as a Data Modeler and Data Warehouse Architect. Strong experience across the entire Data Warehouse/Data Mart life cycle using data modeling best practices. Strong knowledge of data warehousing concepts (the Ralph Kimball and Bill Inmon approaches) and dimensional modeling (star schema and snowflake schema).

  • 5+ years of strong data modeling experience covering dimensional modeling, star and snowflake schemas, fact and dimension tables, and conceptual, logical, and physical data models.
  • Extensive experience with ER (Entity Relationship) data modeling, mostly on live and transactional systems, with the ability to convert complex business ideas into data models.
  • Proven experience in Data analysis, Data modeling, Data management, documentation and presentation.
  • Extensive experience as an ETL Developer on Informatica PowerCenter/PowerMart 7.1/7.3/8.1/9.0/9.5/9.6, and experience as an ETL Designer.
  • Developed in databases such as Oracle 8.x/9.x/10g, MySQL, PostgreSQL 8.x/9.x, Sybase ASE/IQ 11.x/12.x, MS SQL Server 2000/2005/2008, MS Access, and Teradata, with SQL*Plus and SQL*Loader, using Toad, SQL Developer, DBArtisan, Rapid SQL, and pgAdmin III.
  • Good experience in back-end programming using PL/SQL: triggers, stored procedures, database table design, indexing, and performance tuning in Oracle.
  • Proficient in coding UNIX shell scripts.
  • Performance tuning of Informatica mappings and sessions and of Oracle PL/SQL blocks.
  • Hands-on experience with BusinessObjects 5.0.1 and Power Analyzer 4.0.
  • Used application packages such as Visual SourceSafe 6.0 and MS Office 2000.
  • Strong documentation, communication, problem-solving, and analytical skills.
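The star-schema concepts above (fact and dimension tables joined through surrogate keys) can be illustrated with a toy example; the table and column names below are hypothetical, not taken from any engagement described in this resume.

```python
import sqlite3

# In-memory database standing in for a warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A minimal star schema: one fact table surrounded by two dimensions.
cur.executescript("""
CREATE TABLE product_dim (
    product_key  INTEGER PRIMARY KEY,  -- surrogate key
    product_code TEXT,                 -- natural/business key
    product_name TEXT
);
CREATE TABLE date_dim (
    date_key  INTEGER PRIMARY KEY,     -- e.g. 20240115
    full_date TEXT,
    year      INTEGER
);
CREATE TABLE sales_fact (
    product_key INTEGER REFERENCES product_dim(product_key),
    date_key    INTEGER REFERENCES date_dim(date_key),
    quantity    INTEGER,
    amount      REAL
);
""")

cur.execute("INSERT INTO product_dim VALUES (1, 'P-100', 'Widget')")
cur.execute("INSERT INTO date_dim VALUES (20240115, '2024-01-15', 2024)")

# Fact load: resolve the surrogate key from the natural key, then insert.
(product_key,) = cur.execute(
    "SELECT product_key FROM product_dim WHERE product_code = ?", ("P-100",)
).fetchone()
cur.execute("INSERT INTO sales_fact VALUES (?, ?, ?, ?)",
            (product_key, 20240115, 3, 29.97))

# A typical star-schema query: aggregate the fact across its dimensions.
total = cur.execute("""
    SELECT SUM(f.amount)
    FROM sales_fact f
    JOIN product_dim p ON p.product_key = f.product_key
    JOIN date_dim d ON d.date_key = f.date_key
    WHERE d.year = 2024
""").fetchone()[0]
print(total)
```

The same pattern scales to any engine listed above: dimensions carry the descriptive attributes, the fact carries only keys and measures.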
TECHNICAL SKILLS
  • Data Modeling: PowerDesigner 15.3/16.0, Erwin 7.x, ER/Studio 9.0
  • ETL: Informatica 9.0/8.6/8.1/7.1/6.1/5.1
  • DBMS/RDBMS: Oracle 8i/9i/10g, Teradata 5.1, SQL Server 2000/2005/2008, MySQL, PostgreSQL 9.0, Sybase ASE/IQ, Netezza TwinFin 4.5/5.0/6.0
  • Programming Language: SQL, PL/SQL, Python 2.x
  • OLAP/Reporting Tools: Cognos 6.x, BusinessObjects 5.1, Crystal Reports, Power Analyzer
  • Operating Systems: Windows family, UNIX, Linux
PROFESSIONAL PROFILE

Confidential, Knoxville, TN

Project Name: Confidential

Role: Data Modeler and Data Integration Architect

Platform: ER/Studio Data Architect 9.0, Informatica 9.5, MySQL, PostgreSQL 9.0, Oracle 10g, SQL, PL/SQL Developer, SQL*Plus, Toad, Shell Scripting

Role and Responsibilities:

  • As the data modeler, my primary role was to oversee the design of all data models across the end-to-end delivery process; the tool of choice was ER/Studio Data Architect 9.0.
  • Created conceptual, logical, and physical models for various applications and the MAM Data Warehouse; the physical models targeted several databases, including Oracle, MySQL, and PostgreSQL.
  • Used forward and reverse engineering to compare and enhance the star schema data models.
  • Performed data profiling on various source systems to ensure data consistency.
  • Worked extensively on loading dimensions and facts; handled change data capture for dimensions.
  • Created data dictionaries and performed metadata management for various data marts.
  • Created wiki documents for requirements and maintained data models on the Scripps wiki pages.
  • Created ETL mappings using web service, JSON, and complex XML feeds.
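Feeds like the JSON sources mentioned above are typically flattened into relational rows before loading into staging. A minimal sketch of that shape change, with an entirely hypothetical feed layout and field names:

```python
import json

# Hypothetical nested feed payload; real feeds arrived via web services.
feed = json.loads("""
{
  "orders": [
    {"id": 1, "customer": "ACME", "lines": [
        {"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]},
    {"id": 2, "customer": "Globex", "lines": [
        {"sku": "A1", "qty": 5}]}
  ]
}
""")

# Flatten the nested structure into one tuple per order line --
# the row shape a relational staging table would expect.
rows = [
    (order["id"], order["customer"], line["sku"], line["qty"])
    for order in feed["orders"]
    for line in order["lines"]
]
print(rows)
```

Parent attributes (order id, customer) are repeated on every child row, which is exactly what a Normalizer-style transformation produces in an ETL tool.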

Confidential, Omaha, NE

Project Name: Confidential

Role: Data Modeler and Data Warehouse Architect

Platform: PowerDesigner 16.0, Informatica 9.0, Oracle 11g, Netezza TwinFin 5.0/6.0, SQL, PL/SQL Developer, SQL*Plus, WinSQL, Shell Scripting

Role and Responsibilities:

  • Modeled the Salesforce CRM system, creating and enhancing the logical and physical data models using PowerDesigner 15.x and later 16.x.
  • Used forward and reverse engineering to compare and enhance the star schema data models.
  • Performed proof-of-concept analysis to assess project feasibility.
  • Created requirement documents per business specifications.
  • Analyzed data inaccuracies within databases and recommended process improvements or system changes to enhance overall data quality.
  • Performed data cleansing to address missing records or attributes, redundant records, missing keys or other required data, erroneous records, and inaccurate data.
  • Performed data profiling on various source systems to ensure data consistency.
  • Developed, documented, enhanced, and enforced data quality standards.
  • Created data dictionaries and performed metadata management for various data marts.
  • Developed ETL designs to load the Salesforce inbound and outbound (EDW) processes.
  • Worked extensively on loading dimensions and facts; handled change data capture for dimensions.
  • Created mappings, mapplets, and reusable transformations, using transformations such as Normalizer, Lookup, Filter, Expression, Stored Procedure, Aggregator, and Update Strategy.
  • Used parameter files extensively while loading data; used external loaders such as SQL*Loader through the Informatica external loader option.
  • Designed the ETL in Informatica 9 to load the ODS and the DWH (star schema) using standard Kimball methodology.
  • Created utility programs for file manipulation and pre-/post-session processing using shell scripts, and used Autosys to schedule workflows.
  • Involved in ETL testing; created unit and integration test plans to test the mappings, and created the test data.
  • Worked extensively with Netezza, Oracle, and SQL*Plus on query performance tuning; created DDL scripts and database objects such as tables, indexes, synonyms, sequences, and stored procedures; worked closely with DBAs to improve SQL performance and was involved in data import/export.
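Change data capture for a dimension, as in the loading bullets above, is commonly handled as a Kimball Type 2 slowly changing dimension: when a tracked attribute changes, the current row is expired and a new version is inserted. A simplified in-memory sketch, with illustrative keys and attributes only:

```python
# Current dimension rows: natural key -> attributes plus versioning columns.
dimension = {
    "C001": {"city": "Omaha", "from": "2013-01-01", "current": True},
}
history = []  # expired (non-current) versions

def apply_scd2(incoming, load_date):
    """Type 2 change capture: version rows whose tracked attribute changed."""
    for key, attrs in incoming.items():
        row = dimension.get(key)
        if row is None:
            # Brand-new member: insert as the current version.
            dimension[key] = {**attrs, "from": load_date, "current": True}
        elif row["city"] != attrs["city"]:
            # Changed member: expire the old version, insert a new current row.
            history.append({**row, "key": key, "current": False, "to": load_date})
            dimension[key] = {**attrs, "from": load_date, "current": True}
        # Unchanged members are left untouched.

apply_scd2({"C001": {"city": "Lincoln"}, "C002": {"city": "Kearney"}}, "2014-06-01")
print(dimension["C001"], len(history))
```

In an ETL tool the same comparison is what a Lookup transformation plus an Update Strategy implements against the target dimension table.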

Confidential, Horsham, PA

Project Name: Confidential

Role: Data Modeler and Data Warehouse Architect

Platform: Erwin 7.x, Informatica 8.6, Oracle 11g, SQL Server 2005, DB2, SQL, PL/SQL, SQL*Plus, SQL Navigator, Toad 9.0, Shell Scripting

Role and Responsibilities:

  • Developed logical and physical data models using Erwin Data Modeler 7.x.
  • Used forward and reverse engineering to compare and enhance the star schema data models.
  • Performed proof-of-concept analysis to assess project feasibility.
  • Created requirement documents per business specifications.
  • Analyzed data inaccuracies within databases and recommended process improvements or system changes to enhance overall data quality.
  • Performed data cleansing to address missing records or attributes, redundant records, missing keys or other required data, erroneous records, and inaccurate data.
  • Redesigned the existing facts and dimensions for employee performance reporting, achieving better performance by computing the metrics in the ETL rather than in Siebel Analytics.
  • Worked as an ETL Architect using Informatica 8.6.
  • Developed the ETL design to load various metrics into the employee performance fact; performed ETL tuning to improve load performance.
  • Designed the ETL in Informatica 8.6 to load the ODS and the DWH (star schema) using standard Kimball methodology.
  • Performed ETL unit, system, and integration testing to validate the data against the business logic.
  • Created mappings, mapplets, and reusable transformations, using transformations such as Normalizer, Lookup, Filter, Expression, Stored Procedure, Aggregator, and Update Strategy.
  • Used parameter files extensively while loading data; used external loaders such as SQL*Loader through the Informatica external loader option.
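Parameter files like those referenced above keep load-specific values (dates, directories, connections) out of the mappings themselves. The sketch below parses the common Informatica-style layout of [section] headers followed by $$NAME=value lines; the workflow and session names shown are hypothetical:

```python
def parse_param_file(text):
    """Parse a simple Informatica-style parameter file:
    [section] headers followed by $$NAME=value lines."""
    params, section = {}, None
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith(";"):
            continue  # skip blank and comment lines
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]          # new section header
            params[section] = {}
        elif "=" in line and section is not None:
            name, value = line.split("=", 1)
            params[section][name.strip()] = value.strip()
    return params

# Hypothetical file contents: a global section plus one session-scoped section.
sample = """
[Global]
$$LOAD_DATE=2014-06-01
[DWH.WF:wf_load_sales.ST:s_m_load_fact]
$$SOURCE_DIR=/data/inbound
"""
params = parse_param_file(sample)
print(params["Global"]["$$LOAD_DATE"])
```

Keeping values sectioned this way lets one file drive many workflows, with session-level sections overriding the global defaults.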
