
Sr. ETL Technology Lead Resume


Dallas, TX

SUMMARY

  • Over 9 years of experience in designing, developing, and maintaining large business applications involving data migration, integration, conversion, and data warehousing.
  • Experience includes thorough domain knowledge of the banking, insurance (and reinsurance), healthcare, pharmacy, and telecom industries.
  • Experience working with various versions of Informatica PowerCenter client and server tools.
  • Reviewed and assessed business requirements, identified gaps, defined business processes, and delivered project roadmaps covering documentation, initial source data definitions, mappings, detailed ETL development specifications, and operations documentation.
  • Expertise in data warehousing, ETL architecture, data profiling, and the business analytics warehouse (BAW).
  • Designed and developed Informatica mappings enabling the extraction, transport, and loading of data into target tables in Teradata.
  • Created workflows, worklets, and tasks to schedule loads at the required frequency using Workflow Manager, and passed the data to Microsoft SharePoint.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Created complex mappings using Aggregator, Expression, and Joiner transformations.
  • Involved in generating reports from the data mart using OBIEE and working with Teradata.
  • Expertise in tuning the performance of mappings and sessions in Informatica and determining the performance bottlenecks.
  • Experience in creating pre-session and post-session scripts to ensure timely, accurate processing and balancing of job runs (a balancing sketch follows this list).
  • Experience in integrating various data sources such as SQL Server, Oracle, Teradata, flat files, and mainframe DB2.
  • Strong experience writing complex PL/SQL packages, functions, cursors, triggers, views, and materialized views, as well as T-SQL and DTS.
  • Thorough knowledge of OLAP variants such as DOLAP, MOLAP, ROLAP, and HOLAP.
  • In-depth knowledge of designing fact and dimension tables and physical and logical data models using Erwin 4.0, including forward and reverse engineering.
  • Experience in creating UNIX shell scripts and Perl scripts.
  • Knowledge of report development using Business Objects, Cognos, and MicroStrategy.
  • Knowledge of installing and configuring the Informatica server with SQL Server and Oracle; able to handle Informatica administrator tasks such as configuring DSNs, creating connection strings, copying and moving mappings and workflows, and creating folders.
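
A minimal sketch of the pre-session balancing check mentioned above, assuming the source file arrives with a one-line control file carrying its expected row count; the file names are hypothetical placeholders.

#!/bin/ksh
# Pre-session balancing check: refuse to start the load if the detail row
# count in the flat file does not match the count in its control file.
DATA=/data/etl/in/claims_daily.dat       # hypothetical detail file
CTL=/data/etl/in/claims_daily.ctl        # hypothetical control file (one line: row count)

expected=$(cat "$CTL")
actual=$(wc -l < "$DATA")

if [ "$actual" -ne "$expected" ]; then
    echo "Balance check failed: expected $expected rows, found $actual" >&2
    exit 1                               # non-zero exit aborts the session
fi
exit 0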

TECHNICAL SKILLS

Data Warehousing: Informatica PowerCenter, PowerExchange for DB2, Metadata Reporter, Data Profiling, Data Cleansing, Star & Snowflake Schema, Fact & Dimension Tables, Physical & Logical Data Modeling, DataStage, Erwin

Business Intelligence Tools: Business Objects, Cognos

Databases: MS SQL Server, Oracle, Sybase, Teradata, MySQL, MS-Access, DB2

Database Tools: SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, SQL Trace

Development Languages: C, C++, XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting

Other Tools and Technologies: MS Visual SourceSafe, PVCS, AutoSys, crontab, Mercury Quality Center

PROFESSIONAL EXPERIENCE

Confidential, Dallas TX

Sr. ETL Technology Lead

Responsibilities:

  • Acted as ETL tech lead on projects involving large-scale data transitions and migrations; also served as Informatica administrator, installing and configuring Informatica and maintaining users, roles, groups, and privileges.
  • Acted as ETL tech lead managing several offshore and onshore teams creating mappings from flat files and connecting them with XML sources.
  • Created several Informatica mappings across all EDW environments, ran and monitored the workflows, and used Informatica MDM to modify existing data models.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Configured match rule set properties by enabling search-by-rules in Informatica MDM according to business rules; also transitioned data from Oracle Apps and worked heavily on Oracle databases using PL/SQL.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Worked in Informatica Data Quality (IDQ) creating rules and setting up reference tables; worked extensively with Metadata Manager to browse, analyze, and manage metadata from different repositories and to search metadata objects.
  • Used Informatica Data Quality to perform data profiling; created mappings connecting XML sources with payload DB2 databases and WPS tables in the event store databases.
  • Integrated flat files, web services, and databases across the network and performed dimensional modeling.
  • Set up and managed an N-node Hadoop cluster, including effective monitoring and alerting using Ganglia and Nagios.
  • Analyzed the Hadoop cluster with big data tools including Pig, HBase, and Sqoop.
  • Handled cluster coordination through ZooKeeper and added new nodes to an existing cluster.
  • Supported MapReduce programs running on the cluster and developed Java UDFs for operational support.
  • Ran Hadoop Streaming jobs to process terabytes of data.
  • Created audit and control processes using Teradata metadata tables and ran scripts against the audit and control statistics; performed business and system analysis and handled scenarios with both 3NF and star schemas.
  • Used Teradata utilities such as FastLoad to move large volumes of data using multiple database sessions.
  • Used Informatica DT Studio components such as the Parser and Serializer; created customized XML schemas configured through the Unstructured Data transformation in Informatica and worked on complicated data exchange scenarios moving data across mainframes.
  • Accessed Informatica DT Studio projects and created DT Studio scripts, uploaded to the server, for modifying existing Informatica schemas using the Unstructured Data transformation.
  • Created several design patterns and standards documents for the ETL systems explaining the processes above; used DT Studio (Data Transformation Studio) authoring to create .tgp scripts and extracted reports using Cognos.
  • Acted as data integration consultant integrating several databases; tested them using PL/SQL, analyzed mapping scenarios with SQL development expertise in Oracle and SQL Server 2008, and created high-level task plans and deliverables.
  • Created all paths and folders accessible from UNIX and scheduled the Informatica workflows via crontab (see the sketch after this list); performed data analysis using advanced techniques.
  • Created SSIS packages to cleanse and load data to the data warehouse and to transfer data between OLTP and OLAP databases.
  • Created SSIS packages using Pivot transformation, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Aggregate, Execute SQL Task, Data Flow Task, and Execute Package Task to generate the underlying data for reports and to export cleansed data from Excel spreadsheets, text files, MS Access, and CSV files to the data warehouse.
  • Deployed and scheduled SSRS reports to generate daily, weekly, monthly, and quarterly reports, including current status.
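
A minimal sketch of the crontab-driven workflow scheduling referenced above, assuming pmcmd is on the PATH and the repository password is held in an environment variable; the service, domain, folder, workflow, and mail address are hypothetical placeholders.

#!/bin/ksh
# wf_edw_load.sh -- start an Informatica workflow via pmcmd and alert on failure.
LOG=/var/opt/etl/logs/wf_stg_to_edw_$(date +%Y%m%d_%H%M%S).log

pmcmd startworkflow -sv IS_EDW -d Domain_EDW \
    -u etl_ops -pv INFA_PASSWD \
    -f EDW_LOADS -wait wf_stg_to_edw > "$LOG" 2>&1

if [ $? -ne 0 ]; then
    mailx -s "wf_stg_to_edw FAILED" etl-oncall@example.com < "$LOG"
    exit 1
fi

The matching crontab entry would run the wrapper nightly:

# m h dom mon dow  command
30 1 *   *   *    /opt/etl/bin/wf_edw_load.sh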

Environment: Teradata 14, Erwin, Informatica PowerCenter 9.1, Informatica PowerExchange, Informatica DT, SQL Server 2014, Oracle 11, BAW, Teradata 6, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Cognos, Mercury Quality Center, MDM

Confidential

Sr. ETL/BI Data Warehouse Developer

Responsibilities:

  • Performed modeling, estimation, requirements analysis, and design of the mapping document, planning with ETL and BI tools, MDM, and Toad across the various source environments.
  • Coordinated ETL systems and weblog analysis; deliverables included ERDs, data models, data flow diagrams, use cases, gap analyses, and process flow documents, applying an expert understanding of the Ralph Kimball methodology.
  • Worked in big data analytics using Cassandra, CouchDB, MapReduce, and relational databases.
  • Worked on upgrading and data analysis of the ERIC (Employment Resource Information Center) data model and used Informatica B2B PowerCenter and B2B Data Exchange.
  • Worked extensively with Teradata for data extraction, transformation, and loading from source to target using the Teradata loading utilities BTEQ, FastLoad, MultiLoad, and TPump (see the BTEQ sketch after this list); also performed pushdown optimization with Teradata.
  • Worked with Teradata 14 and wrote scripts for loading data to the target data warehouse; applied SQL development expertise with Oracle and SQL Server 2008, worked extensively with Metadata Manager to browse, analyze, and manage metadata from different repositories and search metadata objects, and transitioned data from Oracle Apps.
  • Designed and developed ELT (extract, load, transform) solutions for bulk transformations of client data coming from mainframe DB2; performed data analysis using advanced B2B Data Exchange techniques and handled scenarios with both 3NF and star schemas.
  • Used Informatica DT Studio components such as the Parser and Serializer and created customized XML schemas configured through the Unstructured Data transformation in Informatica.
  • Accessed Informatica DT Studio projects and created DT Studio scripts, uploaded to the server, for modifying existing Informatica schemas using the Unstructured Data transformation.
  • Created several design patterns, standards documents, and ETL strategies explaining the processes above; used DT Studio (Data Transformation Studio) authoring to create .tgp scripts.
  • Developed complex PL/SQL procedures and packages as part of transformation and data cleansing.
  • Developed and used Informatica MDM components for data governance and data integration projects; performed dimensional modeling and extracted reports using Cognos.
  • Set up batches and used Informatica MDM components such as Informatica Data Director and data controls; performed business and system analysis and created high-level task plans and deliverables.
  • Used the Informatica Debugger extensively to validate mappings and gather troubleshooting information about data and error conditions; also participated in file administration tasks.
  • Designed and developed Informatica mappings using Informatica 9.1, enabling the extraction, transport, and loading of data into target tables in Teradata.
  • Used Informatica Data Quality (IDQ) to create scorecards and perform column analysis with detailed cleansing; analyzed PHI data and worked heavily on Oracle databases using PL/SQL.
  • Participated in installing the Informatica Data Quality (IDQ) server and performed data manipulation.
  • Created workflows, worklets, and tasks to schedule loads at the required frequency using Workflow Manager, and passed the data to Microsoft SharePoint.
  • Performed ETL operations supporting data loads and transformations using SSIS.
  • Involved in migration from Oracle to SQL Server using SSIS.
  • Developed cubes using SQL Server Analysis Services (SSAS).
  • Deployed SSAS cubes to the production server; worked on installation of the Facets software.
  • Generated reports from the cubes by connecting to the Analysis Server from SSRS.
  • Experience developing and extending SSAS cubes, dimensions, data source views, and SSAS data mining models, and deploying and processing SSAS objects.
  • Performed configuration management to migrate Informatica mappings, sessions, and workflows from development to test to production environments.
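
A minimal sketch of a Korn-shell BTEQ wrapper of the kind used for the Teradata loading and audit scripting above; the logon file and table names are hypothetical placeholders, not actual project objects.

#!/bin/ksh
# Run an audit-count reconciliation in BTEQ; a non-zero .QUIT code fails the job.
bteq <<'EOF'
-- The logon file contains the .LOGON statement for the target system.
.RUN FILE = /home/etl/.tdlogon
.SET ERROROUT STDOUT

-- Compare staged source counts against the audit/control table.
SELECT s.load_dt, s.src_cnt, a.tgt_cnt
FROM   stg_load_counts s
JOIN   audit_load_counts a
       ON a.load_dt = s.load_dt
WHERE  s.src_cnt <> a.tgt_cnt;

.IF ACTIVITYCOUNT > 0 THEN .QUIT 8
.QUIT 0
EOF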

Environment: Teradata 14, Erwin, Informatica PowerCenter 9.1, Informatica PowerExchange, SQL Server 2008/2012, Oracle 11, BAW, Teradata 6, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Cognos, Mercury Quality Center, MDM

Confidential, Columbus Ohio

Senior ETL Developer

Responsibilities:

  • Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build the Data Mart.
  • Designed ETL components to enable reuse of similar logic across the board; involved in system documentation of data flow and methodology for the EDW and weblogs.
  • Developed mappings to extract data from SQL Server, Oracle, Teradata 12, and flat files and load it into the data mart using PowerCenter; acted as PR4/PR3 programmer manipulating CDC modules, used Pentaho for ETL, performed dimensional modeling, and handled scenarios with both 3NF and star schemas.
  • Developed common routine mappings making use of mapping variables, mapping parameters, and variable functions; created high-level task plans and deliverables and worked heavily on Oracle databases using PL/SQL.
  • Used Informatica Designer in Informatica 9.0 to create complex mappings with transformations such as Filter, Router, connected and unconnected Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the data mart.
  • Developed slowly changing dimension logic for Type 1 SCDs (a Type 1 sketch follows this list) and worked on ETL ODI applications.
  • Used mapplets in mappings, saving valuable design time and effort; also worked on Hyperion integration and performed business and system analysis.
  • Used Informatica Workflow Manager to create, schedule, execute, and monitor sessions, worklets, and workflows; accessed mainframe DB2 data through COBOL programs.
  • Wrote procedures and queries to retrieve data from the DWH and implement it in the data mart; performed data analysis using advanced techniques.
  • Extracted and transferred data to and from the SQL Server database using tools such as Toad and BULK INSERT; worked on contingency plans using SQL queries and extracted reports using Cognos.
  • Used stored procedures as data providers to retrieve data from scheduled tables and complex queries.
  • Developed a centralized schema console using the Business Analytics Warehouse (BAW); wrote analytical queries, designed and developed ELT (extract, load, transform) solutions, and ensured only validated data was loaded, applying SQL development expertise with Oracle and SQL Server 2008.
  • Developed core system components utilizing SQL, Oracle, Informatica, Maestro, and Harvest.
  • Wrote SQL queries, triggers, and PL/SQL procedures to apply and maintain the business rules.
  • Efficient in writing complex T-SQL queries using joins; automated applications using VBScript.
  • Experience in creating SSIS packages and migrating DTS packages from SQL Server 2005 to SQL Server 2008.
  • Extensive ETL experience using DTS/SSIS for data extraction, transformation, and loading from OLTP systems to ODS and OLAP systems.
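
A minimal sketch of the Type 1 SCD refresh mentioned above, expressed as an Oracle MERGE run from a shell step; the connect string and the dimension and staging names are hypothetical placeholders.

#!/bin/ksh
# Type 1 SCD: overwrite changed attributes in place, insert brand-new members.
# $ORA_CONN is assumed to hold a user/password@tns connect string.
sqlplus -s "$ORA_CONN" <<'EOF'
MERGE INTO dim_customer d
USING stg_customer s
   ON (d.customer_nk = s.customer_nk)
WHEN MATCHED THEN UPDATE SET
     d.customer_name = s.customer_name,
     d.region        = s.region
WHEN NOT MATCHED THEN INSERT
     (customer_nk, customer_name, region)
     VALUES (s.customer_nk, s.customer_name, s.region);
COMMIT;
EXIT;
EOF

A Type 2 variant would instead end-date the current row and insert a new row with a fresh surrogate key, preserving history.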

Environment: Informatica PowerCenter 9.0/8.6, SQL Server 2008, Oracle 11i/10g, Teradata 12, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Cognos, Erwin, STAR Team, Remedy, Maestro job scheduler, Mercury Quality Center

Confidential, Lansing MI

Sr. ETL Developer

Responsibilities:

  • Interacted with business community and gathered requirements based on changing needs.
  • Designed ETL components to enable reuse of similar logic across the board; involved in system documentation of data flow and methodology.
  • Identified all dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables; accessed AS/400 mainframe DB2 systems with COBOL.
  • Tested drill-down, drill-up, and pivot reports generated from Cognos.
  • Used components such as Run Program and Run SQL to execute UNIX and SQL commands in Ab Initio and Pentaho.
  • Assisted in designing Logical/Physical Data Models, forward/reverse engineering using Erwin 4.0.
  • Checked performance tuning and debugging at different levels (workflows, mappings, database, etc.); documented using Microsoft Office and performed shell scripting.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
  • Created testing metrics using MS Excel and performed dimensional modeling.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Performed configuration management to migrate Informatica mappings, sessions, and workflows from development to test to production; troubleshot data issues with Oracle Warehouse Builder.
  • Developed the Web Intelligence and full-client reports using Cognos.
  • Performed system, regression, acceptance, functional, and stress testing; also performed business and system analysis with SQL development expertise in Oracle and SQL Server 2008.
  • Created master/detail, cross-tab, slice-and-dice, and drill-down reports; performed data analysis using advanced techniques.

Environment: Informatica, SQL Server 2008, Teradata 6, Oracle 9i, DB2, SQL, PL/SQL, Mainframes, Sun Solaris, UNIX Shell Scripts, Cognos 8, Erwin, AutoSys, Remedy.

Confidential, Lansing MI

Sr. ETL Developer

Responsibilities:

  • Worked closely with business users while gathering requirements, analyzing data and supporting existing reporting solutions.
  • Involved in gathering business scope and technical requirements and created technical specifications; worked extensively with HIPAA standards, including the EDI transaction sets 837, 835 (claim payment/advice), 834, 270, etc.
  • Developed complex mappings and SCD Type I, Type II, and Type III mappings in Informatica to load data from various sources, using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, and SQL.
  • Created complex mapplets for reuse; deployed reusable transformation objects such as mapplets to avoid duplication of metadata and reduce development time, and analyzed PHI data.
  • Created synonyms for copies of time dimensions; used the Sequence Generator transformation to create sequences for generalized dimension keys, the Stored Procedure transformation for encoding and decoding functions, and the Lookup transformation to identify slowly changing dimensions.
  • Fine-tuned existing Informatica maps for performance optimization; used MQ Series for passing distributed data and worked on PowerCenter and PowerExchange B2B.
  • Worked with the Informatica Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer) and Server Manager to create and monitor sessions and batches; performed dimensional modeling with SQL development expertise in Oracle and SQL Server 2008.
  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Debugged mappings by creating logic that assigns a severity level to each error and sends the error rows to an error table so they can be corrected and reloaded into the target system (see the sketch after this list).
  • Involved in unit testing, event and thread testing, and system testing; performed data analysis using advanced techniques.
  • Analyzed the existing system and developed business documentation on the changes required.
  • Made adjustments in the data model and SQL scripts to create and alter tables.
  • Extensively involved in testing the system end to end to ensure the quality of the adjustments made to accommodate the source system upgrade.
  • Worked on various issues in existing Informatica mappings to produce correct output.
  • Analyzed database relationships and data models; also performed business and system analysis.
  • Involved in intensive end-user training (both power users and end users, in Report Studio and Query Studio) with excellent documentation support and experience in ETL systems.
  • Involved in migration from Oracle to SQL Server 2008 using SSMS and SSIS, along with C and C++.
  • Developed cubes using SQL Server Analysis Services (SSAS).
  • Deployed SSAS cubes to the production server; worked on installation of the Facets software.
  • Generated reports from the cubes by connecting to the Analysis Server from SSRS.
  • Experience developing and extending SSAS cubes, dimensions, data source views, and SSAS data mining models, and deploying and processing SSAS objects.
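
A minimal sketch of the error-routing idea above: rows failing a validation are written to an error table with a severity level so they can be corrected and reloaded. The table and column names are hypothetical placeholders.

#!/bin/ksh
# Route invalid staging rows to an error table, tagged with a severity level.
# $ORA_CONN is assumed to hold a user/password@tns connect string.
sqlplus -s "$ORA_CONN" <<'EOF'
INSERT INTO etl_error_rows (src_row_id, err_col, err_msg, severity, logged_at)
SELECT s.row_id, 'member_dob', 'missing or malformed date',
       CASE WHEN s.member_dob IS NULL THEN 'FATAL' ELSE 'WARN' END,
       SYSDATE
FROM   stg_claims s
WHERE  s.member_dob IS NULL OR LENGTH(s.member_dob) <> 10;
COMMIT;
EXIT;
EOF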

Environment: Informatica, Oracle 10g/9i, SQL, SQL Developer, Windows 2008 R2/7, Toad, SQL Server 2008

Confidential, Seattle WA

ETL Developer

Responsibilities:

  • Involved in the full project life cycle, from analysis to production implementation and support, with emphasis on identifying sources, validating source data, developing the required logic and transformations, creating mappings, and loading the data into the business intelligence database (BID).
  • Based on the EDS business requirements created Functional design documents and Technical design specification documents for ETL Process.
  • Reviewed data models using the Erwin tool to identify data model dependencies.
  • Designed and developed ETL solutions in Informatica PowerCenter 8.6 and Toad; performed shell scripting.
  • Designed the ETL process and created ETL design and system design documents.
  • Developed code to extract, transform, and load (ETL) data from inbound flat files and various databases into various outbound files using complex business logic.
  • Created automated shell scripts to transfer files among servers using the FTP and SFTP protocols and to download files from web servers and hosted files (see the SFTP sketch after this list).
  • Developed Informatica mappings enabling the ETL process for large volumes of data into target tables for e-commerce-based applications.
  • Designed and developed processes to handle high volumes of data and heavy data loading within a given load window.
  • Effectively used all kinds of data sources to process the data, finally creating load-ready files (LRFs) as outbound files that serve as inputs to the BID.
  • Created workflows, worklets, and tasks to schedule loads at the required frequency using the Maestro scheduling tool; created Maestro control files to handle job dependencies.
  • Expertise in writing BTEQ scripts in Teradata and running them via Korn shell scripts in HP-UX and SunOS environments.
  • Created dashboards using Crystal Xcelsius for senior management business decision-making on BO Mobile interfaces.
  • Performed server management tasks using the Central Configuration Manager (CCM) and Central Management Console (CMC); worked in UNIX and Windows.
  • Worked on SAP BI reporting involving query building, filtering, free characteristics, restricted key figures, variables, and query variants using the BEx Analyzer.
  • Extensively worked on BO XI 3.1 for reporting purposes.
  • Involved in ad hoc query development and data mining.
  • Worked extensively in InfoView to create Web Intelligence and Desktop Intelligence reports over the universes created.
  • Expertise in creating MultiLoad, FastLoad, and TPump control scripts to load data to the BID.
  • Expertise in creating control files to define job dependencies and for scheduling with the Maestro tool.
  • Involved in job scheduling, monitoring, and production support in a 24/7 environment.
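
A minimal sketch of the automated SFTP transfer described above, assuming key-based authentication is in place; the host, directories, and file pattern are hypothetical placeholders.

#!/bin/ksh
# Push load-ready files (LRFs) to the landing host, archiving on success.
SRC_DIR=/data/etl/outbound
REMOTE=etlxfer@bid-landing.example.com

for f in "$SRC_DIR"/lrf_*.dat; do
    [ -e "$f" ] || continue              # nothing to send
    sftp -b - "$REMOTE" <<EOF
cd /incoming/lrf
put $f
EOF
    if [ $? -eq 0 ]; then
        mv "$f" "$SRC_DIR/sent/"         # archive after a clean transfer
    else
        echo "transfer failed: $f" >> "$SRC_DIR/transfer_errors.log"
    fi
done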

Environment: Informatica PowerCenter 8.6, Business Objects, ETL, Teradata V2R5 as the BID, Oracle 10g/9i/8i, HP-UX, SunOS, Perl scripting, Erwin, PL/SQL, Maestro for scheduling.

Confidential, Sacramento, California

ETL Informatica Developer

Responsibilities:

  • Based on the requirements, created functional design documents and technical design specifications for the ETL process; worked extensively with HIPAA standards including the EDI transaction sets 837, 835 (claim payment/advice), 834, 270, 277, and 278, as well as HIPAA 5010 transactions.
  • Designed and developed Informatica mappings enabling the extraction, transport, and loading of data into target tables.
  • Created workflows, worklets, and tasks to schedule loads at the required frequency using Workflow Manager.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Created complex mappings using Aggregator, Expression, and Joiner transformations.
  • Designed and developed processes to handle high volumes of data and heavy data loading within given load intervals.
  • Involved in testing stored procedures and functions, and in unit and integration testing of Informatica sessions, batches, and the target data.
  • Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Designed and implemented mappings using SCD and CDC methodologies.
  • Designed and developed processes to handle high volumes of data and large-volume data loading within a given load window.
  • Extensively involved in migration of the ETL environment, Informatica, and database objects.
  • Involved in splitting the enterprise data warehouse and Informatica environments into three, one per company.
  • Defined ETL jobs with Data Integrator for loading data from source to target and for data integration with SAP BW systems.
  • Documented the interfaces developed and provided basic training to the users.
  • Created Business Objects Data Integrator jobs detailing the transformations and the sample scenarios in which they are used; applied business logic to data pulled from SAP HANA systems.
  • Created and maintained BO universes from BW queries and InfoCubes.
  • Developed Xcelsius dashboards integrated with QaaWS and Live Office.
  • Created WebI reports with complex calculations to drive alerts for Xcelsius dashboards.
  • Created variables and prompts in WebI.
  • Worked on hierarchy levels and developed hierarchies to support drill-down reports.
  • Created inventory accuracy dashboards (Xcelsius) for management reports.

Environment: Informatica PowerCenter, Business Objects Data Integrator, Business Objects, MS SQL Server 2008, Oracle 9i/8i, Trillium, HP-UX, Erwin 4.2, PL/SQL.

Confidential, Portland Oregon

ETL Informatica Developer

Responsibilities:

  • Involved in analysis, requirements gathering, and documenting functional and technical specifications; worked extensively with HIPAA standards including the EDI transaction sets 837, 835 (claim payment/advice), 834, 270, etc., as well as the HIPAA enforcement and transaction rules.
  • Analyzed and created fact and dimension tables.
  • Designed ER diagrams, the logical model (relationships, cardinality, attributes, and candidate keys), and the physical database (capacity planning and object creation) for Oracle per business requirements using Erwin.
  • Used DB2, legacy systems, Oracle, and Sybase as sources and Oracle as the target.
  • Developed Informatica Power Center mappings for data loads and data cleansing.
  • Wrote stored procedures in PL/SQL and UNIX shell scripts for automated execution of jobs.
  • Wrote shell scripts for Informatica pre-session and post-session commands.
  • Designed the technical layout with standardization, reusability, and scope for future improvement in mind.
  • Documented the purpose of the data warehouse (including transformations, mapplets, mappings, sessions, and batches) so personnel can understand the process and incorporate changes when necessary.
  • Developed complex mappings to extract source data from heterogeneous databases (Teradata, SQL Server, Oracle) and flat files, applied the proper transformation rules, and loaded into the data warehouse.
  • Involved in identifying bugs in existing mappings by analyzing the data flow and evaluating transformations using the Debugger.
  • Implemented various performance tuning techniques on sources, targets, mappings, and workflows.
  • Worked closely with the production control team to schedule shell scripts, Informatica workflows, and PL/SQL code in AutoSys.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
  • Tracked, reviewed, and analyzed defects.
  • Conducted UAT (User Acceptance Testing) with the user community.
  • Developed Korn shell scripts to run from Informatica pre-session and post-session commands; set up on-success and on-failure emails to send reports to the team (a notification sketch follows this list).
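
A minimal sketch of the on-success/on-failure notification described above, written as a post-session shell command; the session name, log path, and recipients are hypothetical placeholders.

#!/bin/ksh
# Post-session notifier: mail a short run report to the team.
# Usage: notify.sh <session_name> <SUCCESS|FAILURE>
SESSION=$1
STATUS=$2
LOG=/var/opt/etl/logs/${SESSION}.log
REPORT=/tmp/${SESSION}_report.txt

grep -i -e "rows" -e "error" "$LOG" > "$REPORT"

if [ "$STATUS" = "SUCCESS" ]; then
    mailx -s "${SESSION} completed" etl-team@example.com < "$REPORT"
else
    mailx -s "${SESSION} FAILED" etl-oncall@example.com < "$REPORT"
fi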

Environment: Informatica, Oracle 9i, PL/SQL, Cognos Impromptu 6.0, Cognos PowerPlay 6.6, Erwin 4.0, UNIX, Windows NT.
