
DW/Netezza/BI Consultant Resume


Dallas, TX

SUMMARY

  • 5+ years of experience with the Netezza database across all phases of Analysis, Design, Development, Implementation, and Administration.
  • 9+ years of experience in all phases of Analysis, Design, Development, Implementation, and support of Data Warehousing applications using Netezza, SQL, DataStage, PL/SQL, Oracle, Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x, and UNIX.
  • Hands-on experience with nzsql, nzload, nzmigrate, and nzbackup.
  • Extensive experience in business and technical aspects of Data Warehousing and big data.
  • Experience in full life cycle implementation of Data Warehouses.
  • Experience in migrating data from DB2 to Netezza using nzload and ELT mappings (DataStage) in the Data Integration layer.
  • Strong experience in Netezza, Informatica and DataStage ETL/ELT tools.
  • Experience in UNIX shell scripting (Korn and Bash) on UNIX and Linux platforms.
  • Designed and developed an Operational Control module that integrates ETL and ELT processing in the Healthcare domain.
  • Good experience in Autosys.
  • Automated various manual processes used by the production support team, saving time and improving resource utilization.
  • Experience in using versioning tools- SVN and GIT.
  • Created scripts that send notification/warning emails, supporting the smooth running of daily batch jobs.
  • Expertise in Oracle 8i/9i/10g/11g, with hands-on experience building advanced relational database objects.
  • Involved in all project phases: Analysis and Design, Data Modeling (ER Diagrams and Normalization), Development, System Testing, documentation, and implementation.
  • Strong experience in Extraction, Transformation and Loading of data from multiple sources into Data Warehouse and administrating the Informatica PowerCenter environment such as maintaining Informatica Repository and Server processes.
  • Experience in Migrating the Data from Oracle to Netezza Database.
  • Troubleshot and resolved application problems with the Informatica Debugger.
  • Validated, tested, and debugged errors within ETL and ELT transformations.
  • Experienced in creating entity-relationship and dimensional data models using the Kimball/Inmon methodologies, i.e., Star Schema and Snowflake Schema.
  • Experience in Slowly Changing Dimensions (Type 1, Type 2, and Type 3), OLTP, OLAP, Data Reports, Surrogate keys, and Normalization/Denormalization.
  • Hands on experience in Performance Tuning of sources, targets, transformations and sessions.
  • Experience in all phases of the project life cycle and production support.
  • Experience in Informatica Power Exchange for Netezza.
  • Expertise in fine-tuning SQL statements and PL/SQL modules using Explain Plan and SQL Trace to optimize cost and evaluate performance.
  • Experience in translating business requirements into creation of database objects: tables, indexes, constraints, packages, stored procedures, functions, and triggers using Oracle PL/SQL tool.
  • Extensive experience in analysis and design of database, database modeling, ER Diagrams, normalization and de-normalization.
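The DB2-to-Netezza loads summarized above can be illustrated with a small sketch that composes an nzload invocation for a pipe-delimited export file. The database, table, and path names below are hypothetical, and the command is echoed (a dry run) rather than executed.

```shell
#!/bin/sh
# Sketch only: compose an nzload command for a delimited export file.
# All names (salesdw, customer_dim, /stage/...) are illustrative.
build_nzload_cmd() {
  db="$1"; tbl="$2"; datafile="$3"
  # -df names the data file, -delim the field delimiter;
  # -lf and -bf capture the load log and rejected records.
  echo "nzload -db $db -t $tbl -df $datafile -delim '|' -lf $tbl.log -bf $tbl.bad"
}

build_nzload_cmd salesdw customer_dim /stage/customer.dat
```

In practice a wrapper like this would be called per table from the batch driver, with the log and bad files checked after each load.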

TECHNICAL SKILLS

ETL: Informatica 7.x/8.x/9.x, SQL Server 2005 (SSIS), Datastage 9.2

Scheduling Tool: Autosys, DAC (Data Warehousing Admin Console)

Reporting Tool: Oracle Business Intelligence (OBIEE) 10.1.3.x, Siebel Analytics, Cognos 10.1

Databases: Oracle11g/10g/9i/8i, MS SQL Server 2005/2000, Netezza 4.5.1, 6.0.3, 6.0.5, 7.0

Operating Systems: Windows 95/98/2000/2003/7, Windows NT Server/Workstation 4.0, UNIX

Programming: Visual Basic 6.0/5.0, C, SQL, PL/SQL

Other Tools: SQL*Loader, SQL*Plus, TOAD, Eclipse, MS Visio, MS Excel

Scripting Languages: UNIX Shell Scripting - bash, Korn

Methodologies: E-R Modeling, Star Schema, Snowflake Schema, Agile

Data Modeling Tool: Erwin 3.5/4.1

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX

DW/Netezza/BI Consultant

Responsibilities:

  • Designed, developed, and unit tested data loading and data transformation programs related to Netezza Data Warehouses and Data Marts
  • Worked closely with Sr. Architect on mapping document and requirement gathering.
  • Design and develop data loading programs to support real time data loading to Netezza Data Marts
  • Worked on Agile methodology
  • Created databases, users, and groups for lower- and upper-level environments
  • Loaded transaction data from the source into the dimensional Data Marts
  • Worked on multi-tenancy architecture
  • Extensive knowledge of the Bash and Korn scripting languages
  • Worked on ETL (DataStage) jobs from source to target
  • Knowledge of DataStage clients (Administrator, Designer, and Director)
  • Proficient with UNIX/LINUX commands
  • Proficient in SQL development
  • Performed unit testing, integration testing, regression testing and system testing
  • Resolved issues during testing phase
  • Performance tuned the transformations using ASYNC and Finalization
  • Maintained system documentation of best database procedures, practices and standards
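The database, user, and group setup noted above can be sketched as a wrapper that emits the Netezza DDL it would run through nzsql. All names are placeholders, the `IN GROUP` clause is an assumption about the target environment's conventions, and the commands are only echoed here.

```shell
#!/bin/sh
# Dry-run sketch: print the nzsql commands that would provision a database,
# a group, and a user for a given environment. Names are hypothetical.
provision_env() {
  env="$1"   # e.g. dev, qa, prod
  db="claims_${env}"
  grp="etl_${env}_grp"
  usr="etl_${env}_usr"
  echo "nzsql -c \"CREATE DATABASE $db\""
  echo "nzsql -c \"CREATE GROUP $grp\""
  echo "nzsql -c \"CREATE USER $usr WITH PASSWORD 'changeme' IN GROUP $grp\""
}

provision_env dev
```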

Environment: Netezza 7, nzDIF 3.9, UNIX Shell Scripting, nzsql, SQL, Netezza TwinFin 24, Aginity Workbench, DataStage 9.2.

Confidential, St. Petersburg, FL

Netezza Datawarehouse DBA

Responsibilities:

  • Prepared strategic plans for data warehousing projects and related quality documentation
  • Assisted in designing and implementation of detailed data warehouse models and mappings
  • Provided technical expertise to technologies in fields of data warehouse and relevant analytics
  • Performed establishment and maintenance of databases for assigned projects as per business requirements
  • Implemented procedures for daily database monitoring, backup and configuration changes
  • Developed queries, backups, and session management and log management records in detailed manner
  • Executed processes for migrating tables, schemas, and databases to defined and assigned locations.
  • Worked in an Agile methodology
  • Created databases, users, and groups for lower- and upper-level environments
  • Proficient with UNIX/Linux commands
  • Extensive knowledge of the Bash and Korn scripting languages
  • Extensive knowledge of job scheduling with Autosys
  • Suggested enhancement methods, rollout and upgrades for maintenance of existing databases
  • Maintained system documentation of best database procedures, practices and standards
  • Resolved database problems and issues by troubleshooting and responding to service requests
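The daily monitoring and backup procedures above can be sketched as a wrapper that would check system state and run nzbackup. Database names and paths are illustrative, and the commands are echoed rather than executed.

```shell
#!/bin/sh
# Dry-run sketch of a nightly backup step. Names and paths are illustrative.
backup_db() {
  db="$1"; dir="$2"
  stamp=$(date +%Y%m%d)
  # In real use: run nzstate to confirm the system is online, then nzbackup.
  echo "nzstate"
  echo "nzbackup -db $db -dir $dir/$db/$stamp"
}

backup_db edw /backups
```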

Environment: Netezza 7.0.4.x, Netezza 6.0.x.x, UNIX Shell Scripting, nzsql, nzload, nzmigrate, Netezza TwinFin 24, Netezza Mustang, Netezza N2001, Aginity Workbench, WinSQL.

Confidential, Charlotte, NC

DW/Netezza/BI Consultant

Responsibilities:

  • Worked on mapping document and gathering requirements
  • Design and develop data loading programs to support real time data loading to Netezza Data Marts
  • Worked on ETL (DataStage) jobs from source to target
  • Worked on DB2-to-Netezza migration using nzload and DataStage mappings in the DI layer.
  • Worked on thin slicing in an Agile methodology
  • Created databases, users, and groups for lower- and upper-level environments
  • Loaded claims data from the source (Informed) into the dimensional Data Marts
  • Worked on multi-tenancy architecture
  • Extensive knowledge of the Bash and Korn scripting languages
  • Knowledge of DataStage clients (Administrator, Designer, and Director)
  • Designed, developed, and unit tested data loading and data transformation programs related to Netezza Data Warehouses and Data Marts
  • Proficient with UNIX/LINUX commands
  • Proficient in SQL development
  • Worked on creating reports in Cognos
  • Extensive knowledge of job scheduling with Autosys

Environment: Netezza 7, UNIX Shell Scripting, nzsql, SQL, Netezza TwinFin 24, Aginity Workbench, DataStage 9.2, Cognos 10.1.

Confidential

Responsibilities:

  • Designed, developed, and unit tested data loading and data transformation programs related to Netezza Data Warehouses and Data Marts
  • Designed trickle-feed data loading programs to support near-real-time loading of Netezza Data Marts
  • Designed data transformations and exception processing, in addition to refresh and replay data loading capabilities
  • Coded and unit tested the data loading and transformation programs
  • Worked on DB2-to-Netezza migration using nzload and DataStage mappings in the DI layer.
  • Coded orchestration programs that schedule the data loading and data transformation routines
  • Unit tested all data loading and transformation programs
  • Extensive knowledge of the nzDIF Data Integration Framework
  • Extensive knowledge of the Bash shell scripting language
  • Proficient with UNIX/LINUX commands
  • Proficient in SQL development
  • Extensive knowledge of job scheduling with Autosys

Environment: Netezza 6.x & 7, UNIX Shell Scripting, nzsql, SQL, Netezza TwinFin 24, Brightlight Framework 3.8 & 3.9, Aginity Workbench, AQT, SilkTest tool, DataStage 9.2

Confidential

Responsibilities:

  • Worked extensively on Brightlight Framework 3.5, 3.8.
  • Used different framework assets like nzf, app init.
  • Worked on application assets like setup cfg.sh, run, log, ddl, meta, data, auto.
  • Involved in admin workflows like db create, db init.
  • Involved in dm workflows like ddl convert, ora ddl convert and db install.
  • Involved in dev workflows like db intake config, db xfr templates, db referential config.
  • Worked extensively on Aginity Work Bench and AQT.
  • Wrote transformations using shell scripting, mainly Bash.
  • Performed unit testing, integration testing, regression testing and system testing.
  • Resolved issues during testing phase.
  • Performance tuned the transformations using ASYNC and Finalization.
  • Created list files, flow files and shell scripts (.sh).
  • Created data files for test cases using the SilkTest tool and created manifest files.
  • Performed load test on transformations by using generated test data.
  • Worked on Cognos views using the table SQL.
  • Worked on the mapping documents.
  • Assisted QA in testing the QA test cases and resolving the issues.
  • Worked on updating the data model changes.
  • Worked on the triggers and RDM joins.
  • Wrote a DIFF script to find differences between two data model scripts after data model changes.
  • Worked in Agile Environment.
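The DIFF script mentioned above can be sketched as a plain `diff` over two DDL exports that reports whether the data model drifted. The file names and sample DDL are illustrative; real exports would come from a DDL extraction tool.

```shell
#!/bin/sh
# Sketch: compare two DDL exports and report data-model drift.
# File contents here are illustrative stand-ins for real exports.
old_ddl="model_v1.sql"; new_ddl="model_v2.sql"
printf 'CREATE TABLE t (a INT);\n' > "$old_ddl"
printf 'CREATE TABLE t (a INT, b INT);\n' > "$new_ddl"

# diff exits 0 when the files match, non-zero when they differ.
if diff -u "$old_ddl" "$new_ddl" > model.diff; then
  echo "no data model changes"
else
  echo "data model changed; see model.diff"
fi
```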

Environment: Netezza 6.0.5, UNIX Shell Scripting, MS Excel, nzsql, Netezza TwinFin 24, Brightlight Framework 3.5, 3.8, NZF, Aginity Workbench, AQT, SilkTest tool, triggers and RDM joins.

Confidential

Responsibilities:

  • Worked extensively on SQL, PL/SQL, Perl and UNIX shell scripting.
  • Analyzed the source system of ETL Maps.
  • Developed the ETL routines.
  • Designed and developed the fact/dimension entities.
  • Involved in the Unit testing, Event & Thread testing, and System testing.
  • Wrote UNIX shell scripts based on requirements
  • Worked in an Agile environment
  • Migrated data from Oracle to Netezza.
  • Worked on Brightlight Framework 3.0.
  • Worked extensively in NZSQL to migrate the data from Oracle to Netezza database.
  • Performed unit testing of the shell scripts developed with various scenarios.
  • Performed integration testing and system testing.

Environment: Netezza, UNIX Shell Scripting, MS Excel, NZSQL, Netezza TwinFin 12, Brightlight Framework 3.0.

Confidential, Detroit, MI

Sr. Informatica Developer

Responsibilities:

  • Analyzed the existing claims process and specific business rule logic.
  • Gather detailed business and technical requirements and participate in the definitions of business rules and data standards.
  • Created DataStage jobs to extract, transform, and load data into data warehouses from various sources like relational databases, application systems, temp tables, and flat files.
  • Involved in system study, analysis of the requirements, and design of the complete system.
  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Designed and developed complex aggregate, join, look up transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using Informatica ETL (Power Center) tool.
  • Used Informatica features to implement Type I changes in slowly changing dimension tables.
  • Worked on Unstructured data mainly on word documents.
  • Used the Update Strategy Transformation to update the Target Dimension tables.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Created Netezza database, users, base table and views using proper distribution key structure.
  • Used Informatica Power Connect for Netezza to pull data from Netezza data warehouse.
  • Developed mapping parameters and variables to support connection for the target database as Netezza and source database as Oracle OLTP database.
  • Created Complex mappings using Connected/Unconnected Lookup, Aggregator and Router transformations for populating target table in efficient manner.
  • Created Mapplets and used them in different Mappings.
  • Used designer debugger to test the data flow and fix the mappings. Tuned Informatica Mappings and Sessions for optimum performance.
  • Involved in writing various PL/SQL stored procedures, functions and packages.
  • Provided detailed technical, process and support documentation like daily process rollback and detailed specifications and very detailed document of all the projects with the workflows and their dependencies.
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Created events and tasks in the work flows using workflow manager
  • Created Schema objects like Indexes, Views, and Sequences.
  • Created mappings using the Transformations such as the Source qualifier, Aggregator, Expression, lookup, Filter, Router, Rank, Sequence Generator, Update Strategy etc.
  • Extracted data from .BRD files, flat files, and Oracle, and loaded it through Informatica.
  • Used Informatica Workflow Manager and Server to read data from sources, write to target databases, and manage and monitor sessions/tasks; ETL was implemented for Oracle and SQL Server.
  • Worked with DAC for job scheduling.
  • Production Support and issue resolutions.
  • Created test cases and completed unit, integration and system tests for sales data mart system.
  • Involved in writing procedures, functions in PL/SQL.
  • Involved in Unit testing of Mappings, Workflows and debugging mappings for failed sessions.
  • Created partitions, SQL overrides in the Source Qualifier, and session partitions to improve performance.
  • Performed unit testing and system testing of the mappings developed and documented with various scenarios.
  • Wrote UNIX shell scripts based on requirements

Environment: Informatica Power Center 9.1, Informatica Power Center 8.6.1, Informatica Power Exchange, Informatica Power Connect for Netezza, Netezza 4.5.2, Oracle 11i/10g, SQL Server 2005, UNIX, Shell Scripting, DAC, Toad, MS Excel.

Confidential, Atlanta, GA

Informatica Developer/Analyst

Responsibilities:

  • Involved with Business Analysis team in collecting the business requirements based on user specifications and transformed them into ETL Specs.
  • Created design/technical documents covering processes and their dependencies so that developers can refer to them as a design reference.
  • Worked on ODS/Staging environments in data warehouse projects.
  • Fine Tuned the SQL queries using hints for maximum efficiency and performance.
  • Interacting with the users and troubleshooting the problems involved with the development of stored procedures, triggers and with the privileges.
  • Extracted data from various sources like Flat files, Oracle and loaded it into Target systems Netezza 4.5.1 using Informatica Power Center 8.6.1
  • Created Netezza database, users, base table and views using proper distribution key structure.
  • Used Informatica Power Exchange for Netezza to pull data from Netezza data warehouse.
  • Developed mapping parameters and variables to support connection for the target database as Netezza and source database as Oracle OLTP database.
  • Partitioned session for concurrent loading of data in to the target tables.
  • Extensively worked with stored procedure, Expression, Filter, Joiner, Source Qualifier, Look up, Router, Update strategy and Aggregator transformations.
  • Analyzed performance bottlenecks, tuned various mappings and created reusable transformations to increase the performance.
  • Discussed and finalized standards for ETL coding.
  • Designed and developed complex mappings using various transformations in Designer to load data from Teradata, Flat Files, SQL Server, Oracle 9i into the target Data warehouse
  • Developed several Informatica mappings for Extraction/Transformation, data cleansing and loading of data into staging areas and data marts.
  • Developed pre-session, post-session routines to drop indexes and to reassign them and batch execution routines to run Informatica sessions.
  • Created various tasks, arranging related tasks in different worklets, arranged different worklets in workflow depending upon their interdependencies in workflow manager.
  • Developed parameter file templates and used parameters, variables to make mappings more flexible, also developed shell scripts to automate the process.
  • Automated sending Car data files to third-party users through FTP
  • Performed unit testing and system testing of the mappings developed and documented with various scenarios.
  • Used ETL Metadata tables to verify performance of the various sessions used in the workflow and fine tune for the long running sessions.
  • Worked closely with the production support and data teams to recover from issues and various defects encountered while running the jobs and analyzing the data.
  • Worked with DAC for job scheduling.

Environment: Informatica Power Center 8.6.1, Netezza 4.5.1, Informatica Power Exchange, Oracle 11i/10g, ERWIN, SQL Server 2005, Teradata, XML files, UNIX, Shell Scripting, DAC, MS Excel.

Confidential, Irving, TX

Informatica Developer/Analyst

Responsibilities:

  • Worked with business analysts to understand business/system requirements in order to transform them into effective technology solutions.
  • Worked on Informatica tools like Designer, Workflow Manager, and Workflow Monitor.
  • Working mostly on Lookup, Aggregator, Router and Expression Transformations to implement complex logics while coding a Mapping.
  • Monitoring the Jobs, observing the performance of the individual transformations and tuning the code.
  • Creating Indexes on tables for most retrieved fields of tables and views.
  • Follow up with appropriate documents for the Change Management, Incident Management, and Problem Management.
  • Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
  • Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.
  • Integrated Informatica Data Quality with Informatica Power Center.
  • Installed IDQ on an Informatica PowerCenter server machine.
  • Involved in jobs scheduling, monitoring and production support in a 24/7 environment (with a paging system).
  • Made substantial contributions in simplifying the development and maintenance of ETL by creating shortcuts, re-usable Mapplets and Transformation objects.
  • Implemented slowly changing dimensions Type 1, and Type 2 methodology for accessing the full history of accounts and transaction information.
  • Provided data loading, monitoring, system support and general trouble shooting necessary for all the workflows involved in the application during its production support phase.
  • Provided production support by monitoring the processes running daily.
  • Created, optimized, reviewed, and executed SQL test queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables.
  • Took part in Informatica administration; migrated development mappings as well as hot fixes to the production environment.
  • Used Informatica scheduling for defining, scheduling, monitoring the jobs.
  • Involved in analyzing the year-end process and managed the loads that have year-end aggregations.
  • Used Informatica PowerExchange to import SQL Server sources.
  • Used Informatica PowerExchange to test data in mainframe .dat files.
  • Executed sessions, both sequential and concurrent for efficient execution of mappings and used other tasks like event wait, event raise, email, command and pre/post SQL.
  • Created Workflows with worklets, event wait, decision box, email and command tasks using Workflow Manager and monitored them in Workflow Monitor.
  • Created stored procedures, sequences, and triggers to insert keys into database tables.
  • Created Pre/Post Session/SQL commands in sessions and mappings on the target instance.
  • Used the command-line program pmcmd to run Informatica jobs, and used these commands in shell scripts to create, schedule, and control workflows, tasks, and sessions.
  • Provided detailed technical, process and support documentation like daily process rollback and detailed specifications and very detailed document of all the projects with the workflows and their dependencies.
  • Involved in writing shell scripts for file transfers, file renaming and several other database scripts to be executed from UNIX.
  • Created documents for data flow and ETL process using Informatica mappings to support the project once it implemented in production.
  • Created and Documented ETL Test Plans, Test Cases, Test Scripts, Expected Results, Assumptions and Validations.
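The pmcmd usage described above can be sketched as a wrapper that composes a startworkflow call. The integration service, domain, folder, and workflow names are placeholders, and the command is echoed (dry run) rather than executed.

```shell
#!/bin/sh
# Dry-run sketch: compose a pmcmd startworkflow call.
# Service/domain/folder/workflow names are hypothetical; credentials
# would come from environment variables, not the script.
start_wf() {
  folder="$1"; wf="$2"
  echo "pmcmd startworkflow -sv IntSvc -d Domain_ETL -u \$PM_USER -p \$PM_PASS -f $folder -wait $wf"
}

start_wf CLAIMS wf_daily_load
```

A shell driver would call `start_wf` per workflow and check the exit status of the real pmcmd to decide whether downstream jobs may run.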

Environment: Informatica Power Center 8.6.1, Informatica Power Exchange 8.6.1, Oracle 10g/11g, SQL Server 2008, Toad for Oracle, DB2, Flat Files, WinNT 4.0, UNIX, Linux, AIX, Business Objects 6.5, MS Excel.

Confidential, Houston, TX

Informatica Developer

Responsibilities:

  • Coordinated/Managed the development and project implementation and production support activities.
  • Participated in business analysis, ETL requirements gathering, physical and logical data modeling and documentation.
  • Designed the data transformation mappings and data quality verification programs using Informatica and PL/SQL.
  • Extracted source definitions from various databases like Oracle, Sybase, MS SQL Server, MS Access and Flat-files into Informatica repository.
  • Developed and Tested Mappings using Informatica Power Center Designer.
  • Designed Reusable Transformations and Mapplets. Used most of the Transformations like Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expressions, Aggregator, Filter, and Sequence Generator for loading the data into Oracle database.
  • Designed Workflows, reusable Tasks and Worklets using Informatica Workflow Manager SQL and Database tuning.
  • Investigate, debug and fix problems with Informatica Mappings and Workflows, BO reports
  • Participated in Decision support team to analyze the user requirements and to translate them to technical team for new and change requests.
  • Performed unit and integration testing in User Acceptance Test (UAT), Operational Acceptance Test (OAT), Production Support Environment (PSE) and Production environments.
  • Prepared technical documentation.

Environment: Informatica Power Center 8.1, Oracle 9i, MS SQL Server 2000, PL/SQL, and MS Access.

Confidential, OK

Informatica Developer

Responsibilities:

  • Involved in gathering business requirements and molding them into the technical specifications required for the conversions team.
  • Designed ETL mapping based on Existing ETL logic.
  • Worked with various transformations like Source Qualifier, Look up (connected and unconnected), Expression, Aggregate, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router transformations.
  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Extensively used ETL to load data from flat files (excel/access) to Oracle.
  • Load balancing of ETL processes, database performance tuning and Capacity monitoring.
  • Made adjustments in Data Model and SQL scripts to create and alter tables.
  • Worked extensively on SQL, PL/SQL, Perl and UNIX shell scripting.
  • Analyzed the source system of ETL Maps.
  • Developed the ETL routines.
  • Designed and developed the fact/dimension entities.
  • Involved in the Unit testing, Event & Thread testing, and System testing.
  • Coordinated with the client Business & Systems teams for QA.
  • Performed the tuning of ETL SQLs.

Environment: Informatica Power Center 7.1.2, Oracle 9i, SQL, PL/SQL, SQL*Plus, SQL Server, UNIX, Windows NT, and MS Excel.

Confidential, MN

Informatica Developer

Responsibilities:

  • Worked with Data Modeler to understand the Business Requirements and Business Rules.
  • Prepared ETL Specifications to help develop mapping.
  • Extracted data from various sources (Flat files, SQL Server, Teradata, DB2 Databases).
  • Designed and developed Informatica Mappings and sessions based on business user requirements and business rules.
  • Worked extensively on Source Analyzer, Mapping Designer, and Warehouse Designer
  • Developed several Mappings using corresponding Sources, Targets and Transformations.
  • Created source and target table definitions using Informatica Designer; source data was extracted from flat files and SQL Server databases.
  • Analyzed the source data and decided on appropriate extraction, transformation, and loading strategies.
  • Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.
  • Created mappings to read from Flat files, RDBMS and to load into RDBMS tables.
  • Debugged code based on business rules, and tested and validated data after processes were run in development.
  • Used Job scheduling software for Batch Scheduling.
  • Modified the existing mappings for the updated logic and better performance
  • Assisted QA Team to fix and find solutions for the production issues.
  • Used UNIX Shell scripts in coding ETL Tasks.

Environment: Informatica 7.1.2 (PowerCenter), SQL Server 2000, DB2 UDB v8.2, Teradata, MS Word, Excel.

Confidential

Intern/ Trainee

Responsibilities:

  • Developed SQL queries to check the database.
  • Creation of Triggers, Stored Procedures, Tables, Indexes, Rules, Defaults etc.
  • Interaction with users and getting the enhancement requirements.
  • Implementation and maintenance of the product in client sites.
  • Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer and Transformations Developer.
  • Involved in designing the procedures for getting the data from all systems to Data Warehousing system. The data was standardized to store various Business Units info in tables.
  • Creating mappings and different kind of transformations like Source qualifier, Aggregators, lookups, Filters, Sequence Generator, Stored Procedure and Update strategy.
  • Involved in writing SQL Stored procedures to access data from Oracle 7.x

Environment: Informatica Power Center 5.1, Oracle 7.x, Flat Files, Erwin, SQL, PL/SQL, UNIX
