
Sr. Data Architect/Data Modeler Resume


Boston, Massachusetts

PROFESSIONAL SUMMARY:

  • Over 12 years of experience as a Data Architect/Modeler and Data Analyst, with high proficiency in requirements gathering and data modeling, including design and support of various applications in OLTP, Data Warehousing, OLAP and ETL environments.
  • Experienced in designing Star schemas (identification of facts, measures and dimensions) and Snowflake schemas for Data Warehouse and ODS architecture, using tools like Erwin, PowerDesigner, ER/Studio and Microsoft Visio.
  • Expert in writing and optimizing SQL queries in Oracle, SQL Server 2008/2012, Teradata, Netezza and Big Data environments.
  • Excellent Software Development Life Cycle (SDLC) experience, with good working knowledge of testing methodologies, disciplines, tasks, resources and scheduling.
  • Excellent knowledge of Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatches.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad and FastExport.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Experienced in working with Excel Pivot tables and VBA macros for various business scenarios.
  • Extensive experience in supporting Informatica, including data extraction from heterogeneous sources using Informatica PowerCenter.
  • Experience in automating and scheduling Informatica jobs using UNIX shell scripting, configuring Korn shell jobs for Informatica sessions.
  • Experienced with various Teradata utilities like FastLoad, MultiLoad, BTEQ, and Teradata SQL Assistant.
  • Excellent experience working on Netezza and writing heavy SQL queries.
  • Used BTEQ as a utility and query tool for Teradata: to load data into Teradata a row at a time, and to export data from Teradata a row at a time.
  • Used BTEQ to submit SQL in either a batch or an interactive environment; BTEQ returns output in a report format, whereas Queryman outputs data in a format more like a spreadsheet.
  • Also used BTEQ for importing and exporting data.
  • Familiarity with DB2.
  • Experience in designing error and exception handling procedures to identify, record and report errors.
  • Excellent knowledge of creating reports in SAP BusinessObjects, including Web Intelligence (Webi) reports for multiple data providers.
  • Excellent experience in writing and executing unit, system, integration and UAT scripts in data warehouse projects.
  • Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment (a sketch of such a query follows this list).
  • Excellent experience in troubleshooting test scripts, SQL queries, ETL jobs, data warehouse/data mart/data store models.
  • Excellent knowledge in preparing required project documentation, and tracking and reporting regularly on the status of projects to all project stakeholders.
  • Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
  • Experience in testing Business Intelligence reports generated by various BI tools like Cognos, Tableau, MicroStrategy and Business Objects.
  • Extensive ETL testing experience using Informatica 9.x/8.x, Talend and Pentaho.
  • Good exposure to the offshore/onsite model, with the ability to understand and create functional requirements working with clients; good experience in requirements analysis and generating test artifacts from requirements documents.
  • Excellent at creating various artifacts for projects, including specification documents, data mapping and data analysis documents.
  • An excellent team player and technically strong person with the ability to work with business users, project managers, team leads, architects and peers, maintaining a healthy environment in the project.
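
To make the SQL-validation claim above concrete, here is a minimal sketch of a layer-to-layer reconciliation query; the stg_orders and dw_orders tables and their columns are hypothetical stand-ins for real staging and warehouse objects, not tables from any engagement listed below.

    -- Compare row counts per load date between a staging table and its
    -- warehouse target; a FAIL row flags a load discrepancy.
    -- Table and column names are illustrative only.
    SELECT s.load_date,
           s.src_count,
           t.tgt_count,
           CASE WHEN s.src_count = t.tgt_count THEN 'PASS' ELSE 'FAIL' END AS status
    FROM   (SELECT load_date, COUNT(*) AS src_count
            FROM   stg_orders
            GROUP  BY load_date) s
    JOIN   (SELECT load_date, COUNT(*) AS tgt_count
            FROM   dw_orders
            GROUP  BY load_date) t
    ON     s.load_date = t.load_date;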

TECHNICAL SKILLS INVENTORY:

Programming Languages: SQL, PL/SQL, UNIX shell Scripting, PERL, AWK, SED

Databases: Oracle 11g/10g/9i, Teradata R12/R13/R14, MS SQL Server 2005/2008/2012, MS Access, Netezza

Tools: MS Office suite (Word, Excel, MS Project and Outlook), VSS

Testing and Defect Tracking Tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center, Requisite), MS Visio & Visual SourceSafe

Operating System: Windows, Unix, Sun Solaris

ETL/Data Warehouse Tools: Informatica 9.5/9.1/8.6.1/8.1, SAP BusinessObjects XI R3.1/XI R2, Web Intelligence, Talend, Tableau, Pentaho

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

Tools & Software: TOAD, MS Office, BTEQ, Teradata SQL Assistant

PROFESSIONAL EXPERIENCE:

Confidential, Boston, Massachusetts

Sr. Data Architect/Data Modeler

Responsibilities:

  • Provides architectural leadership in shaping strategic business technology projects, with an emphasis on application architecture. Utilizes domain knowledge and application portfolio knowledge to play a key role in defining the future state of large business technology programs.
  • Creates ecosystem models (e.g. conceptual, logical, physical, canonical) that are required for supporting services within the enterprise data architecture (conceptual data model for defining the major subject areas used, ecosystem logical model for defining standard business meaning for entities and fields, and an ecosystem canonical model for defining the standard messages and formats to be used in data integration services throughout the ecosystem).
  • Demonstrated experience in design and implementation of an enterprise data model, metadata solution and data life cycle management in both RDBMS, Big Data environments.
  • Lead and provide suggestions and best practices for end-to-end data delivery.
  • Design and architect with the right balance of performance, scalability and ease of consumption.
  • Design and architect to accommodate data quality.
  • Design and architect to integrate multiple, complex data sources that are transactional and non-transactional, structured and unstructured.
  • Identifies user requirements by researching and analyzing user needs, preferences, objectives, and working methods and studies how users consume content, including data categorization and labeling.
  • Created logical/physical data models using IBM InfoSphere Data Architect for new requirements and existing databases, maintained database standards, provided architectural guidance for various data design/integration/migration efforts, and analyzed data in different systems.
  • Applies architectural and technology concepts to address scalability, security, reliability, maintainability and sharing of enterprise data.
  • Involved in logical modeling using dimensional modeling techniques such as Star Schema and Snowflake Schema (an illustrative DDL sketch follows this list).
  • Worked on database design, relational integrity constraints, OLAP, OLTP, cubes, and normalization (3NF) and de-normalization of the database.
  • Extensively used Aginity Netezza Workbench to perform various DML and DDL operations on the Netezza database.
  • Developed logical and physical model for schemas, created standard data documentation, such as ERD and Flow Diagram.
  • Involved in Normalization and De-Normalization of existing tables for faster query retrieval.
  • Designed the Physical Data Model (PDM) using the IBM InfoSphere Data Architect data modeling tool and Oracle PL/SQL.
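
As an illustration of the star schema work described above, the DDL sketch below shows one fact table keyed to two dimensions. It is a minimal, hypothetical example in Oracle syntax; the table and column names are invented for illustration.

    -- Dimension: one row per customer, with a surrogate key.
    CREATE TABLE dim_customer (
        customer_key   NUMBER        PRIMARY KEY,   -- surrogate key
        customer_id    VARCHAR2(20)  NOT NULL,      -- natural key
        customer_name  VARCHAR2(100)
    );

    -- Dimension: one row per calendar date.
    CREATE TABLE dim_date (
        date_key       NUMBER  PRIMARY KEY,         -- e.g. 20240131
        calendar_date  DATE    NOT NULL
    );

    -- Fact: grain is one sales transaction; measures are additive.
    CREATE TABLE fact_sales (
        date_key       NUMBER  NOT NULL REFERENCES dim_date (date_key),
        customer_key   NUMBER  NOT NULL REFERENCES dim_customer (customer_key),
        sales_amount   NUMBER(12,2),
        quantity_sold  NUMBER
    );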

Environment: Erwin r9.6, Normalization and De-normalization, UNIX, Teradata SQL Assistant, MDM/Activators, Netezza, Aginity, Star and Snowflake schema DDL, PL/SQL, ETL, DB2, associated Data Marts, Data Stores, Oracle 12c and Teradata 15.

Confidential, Overland Park, KS

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Involved in data analysis and creating data mapping documents to capture source to target transformation rules.
  • Worked with business users to gather requirements and create data flow, process flows and functional specification documents.
  • Used Erwin and Visio to create 3NF and dimensional data models and published to the business users and ETL / BI teams.
  • Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
  • Integrated and implemented MDM, Business Intelligence (BI) and Data Warehousing (DW/EDW) solutions.
  • Coordinating and supporting development and testing activities in Teradata Semantic layer, Business Objects Universe and reports.
  • Involved in data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Worked on creating role-playing dimensions, factless fact tables, and snowflake and star schemas.
  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment. Developed working documents to support findings and assign specific tasks.
  • Prepared test cases based on the Technical Specification document.
  • Involved in the validation of the OLAP Unit testing and System Testing of the OLAP Report Functionality and data displayed in the reports.
  • Responsible for testing all new and existing ETL data warehouse components.
  • Moved all mappings and workflows that succeeded in the testing environment from development to production.
  • Performed verification, validation and transformations on the input data (text files, XML files) before loading it into the target database.
  • Interacted with functional analysts to understand requirements and write high level test scripts.
  • Populated and refreshed Teradata tables using the FastLoad, MultiLoad and FastExport utilities for User Acceptance Testing, and loaded history data into Teradata.
  • Reviewed ERD dimensional model diagrams to understand the relationships and cardinalities, to help in the preparation of integrity test cases.
  • Wrote test cases for data extraction, data transformation and reporting.
  • Responsible for testing schemas, joins, data types and column values among source systems, staging and the data mart (a comparison-query sketch follows this list).
  • Analyzed the objectives and scope of each stage of testing process from the Test plan.
  • Interacted with business analysts to gather the requirements for business and performance testing.
  • Worked with reporting tools, query tools, extraction and cleansing tools, and transformation tools using metadata.
  • Responsible for performing data validation, process flow, dependency, functionality testing and User Acceptance Testing.
  • Extensively used Quality Center to prepare test cases, execute test cases and track bugs.
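
A minimal sketch of the source-versus-target comparison used in this kind of testing follows; src_orders and dm_orders are hypothetical staging and data mart tables. MINUS is the Oracle/Teradata set operator (SQL Server uses EXCEPT).

    -- Rows present in the source but missing or altered in the data mart.
    -- Run the reversed query as well to catch extra rows in the target.
    SELECT order_id, order_status, order_amount
    FROM   src_orders
    MINUS
    SELECT order_id, order_status, order_amount
    FROM   dm_orders;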

Environment: Erwin 4.5/4.0, Informatica PowerCenter 8.1/9.1, ODBC, PowerConnect/PowerExchange, Oracle 11g, Mainframes, MES, Reference Data, R, DB2, MS SQL Server 2008 R2, SQL, PL/SQL, XML, Metadata, Windows NT 4.0, Power Query, Talend, UNIX Shell Scripting.

Confidential, Chicago, IL

Sr. Data Modeler, Data Analyst

Responsibilities:

  • Worked closely with development team and business representatives to fully understand data requirements to support business function.
  • Created dimensional data models that contain the measurements, metrics or facts of business processes, and resolved many-to-many relationships in fact tables.
  • Created conformed dimensions to integrate data across the enterprise.
  • Worked as Primary Data Modeler for many databases which are classified into several subject areas. Provide technical consulting in the definition, design, and creation of a database environment.
  • Co-ordinated with Business Analysts to create the Conceptual Data Model.
  • Responsible for the development and maintenance of Logical and Physical data models, along with corresponding metadata, to support Confidential Photo Online Applications.
  • Worked with project team representatives to ensure that logical and physical ERwin data models were developed in line with corporate standards and guidelines.
  • Hands on experience in designing data models and working with database developers to implement reference data applications.
  • Responsible for developing, designing and implementing the standards and processes to support the use of the Repository to inventory, catalog and manage the metadata of Highmark.
  • Created Logical Data Models and Physical Data Models as per business rules and application of correct naming standards.
  • Made changes to the data model per customer needs, and coordinated with the DBA and application developers to resolve any discrepancies between the data model and the current model.
  • Generated physical/logical reports, data dictionary reports and complete compare reports for the changes applied to the data model on the temporary subject area, using the Highmark ERwin Toolbox.
  • Reverse engineered the database and ran complete compares to sync the model, resolving the differences after the model was returned by the DBA.
  • Hands on experience working with Oracle and DB2 database.
  • Designed and implemented databases to support Confidential efforts.

Environment: Erwin 9.5, Oracle 12c, SQL, PL/SQL, Informatica PowerCenter 9.6.1, Toad, UNIX, DB2, Windows NT, MS SQL Server 2000/2005, MS Access, Windows XP.

Confidential, Minneapolis, MN

Sr. Data Modeler/ Data Analyst/Data Architect

Responsibilities:

  • Worked on discovering entities, attributes, relationships and business rules from functional requirements.
  • Used Erwin r9.0 to create Conceptual, Logical and Physical data models.
  • Translated logical data models into physical database models and generated DDL for DBAs.
  • Worked on data mapping and documenting ETL transformations.
  • Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
  • Worked on creating indexes and constraints on various tables to improve performance (an index sketch follows this list).
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Worked on physical, logical and conceptual data models.
  • Designed both 3NF data models for ODS and OLTP systems, and dimensional data models using star and snowflake schemas.
  • Perform data reconciliation between integrated systems.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
  • Collected, analyzed and interpreted complex data for reporting and/or performance trend analysis.
  • Wrote ad-hoc SQL queries and worked with SQL and Netezza databases.
  • Experienced in forecasting data space requirements using features like capacity planning, mapping source-to-target data using the data lineage feature, and calculating impact analysis.
  • Created views and DB2 objects depending upon application group requirements.
  • Collaborated with the data warehousing team, ensuring that data infrastructure supports the needs of analytics team and validating data quality.
  • Developed clear, concise, actionable recommendations from large volumes of data.
  • Generated DDL, JCL and SQL*Loader control files, modified scripts/files and loaded the data into Oracle.
  • Advocated for exploration of interesting data anomalies or patterns that provided more explanatory detail about customer behaviors and predictive value to the business.
  • Worked with engineers to develop, test, and maintain the accurate tracking, capturing and reporting of key product usage metrics via both server-side data collection and web analytics tagging.
  • Developed enterprise data solution frameworks related to data design and logical solution partitioning
  • Developed Enterprise Level data model and data content designs.
  • Setting data architecture standards for enterprise information delivery solutions.
  • Involved in data migration and data lineage, and studied data impact analysis.
  • Architectural oversight of data design from analysis, modeling and mapping, through development and testing.
  • Worked on physical design for both SMP and MPP RDBMS, with an understanding of RDBMS scaling features.
  • Experienced in database performance tuning, optimization and maintenance practices.
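
A minimal sketch of the indexing work mentioned above; the table, column and index names are hypothetical, and the right index always depends on the actual access paths observed.

    -- Composite index supporting a frequent filter-plus-join access path.
    CREATE INDEX ix_fact_sales_date_cust
        ON fact_sales (date_key, customer_key);

    -- Uniqueness constraint enforcing one row per natural key in a dimension.
    ALTER TABLE dim_customer
        ADD CONSTRAINT uq_dim_customer_id UNIQUE (customer_id);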

Environment: T-SQL, Informatica 9.5 (ETL), Oracle 11g, Netezza

Confidential, New York City, NY

Sr. Data Modeler/ Data Analyst/Data Architect

Responsibilities:

  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment. Developed working documents to support findings and assign specific tasks
  • Worked with the PowerDesigner tool and Informatica 9.5/9.1/8.6.1/8.1 (Repository Manager, Designer and Workflow Manager).
  • Involved in data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Worked on physical, logical and conceptual data models.
  • Designed both 3NF data models for ODS and OLTP systems, and dimensional data models using star and snowflake schemas.
  • Wrote and executed unit, system, integration and UAT scripts in data warehouse projects.
  • Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
  • Used DDL and DML to write triggers and stored procedures to check data entry and payment verification (a trigger sketch follows this list).
  • Experienced in source-to-target data mapping using the data lineage technique, and calculated the impact on the data flow using impact analysis.
  • Worked with Star Schema, DB2 and IMS DB.
  • Perform data reconciliation between integrated systems.
  • Performed metrics reporting, data mining and trend analysis in a helpdesk environment using Access.
  • Wrote complex SQL queries to validate data against different kinds of reports generated by Business Objects XI R2.
  • Extensively used MS Access to pull data from various databases and integrate the data.
  • Assisted in the oversight for compliance to the Enterprise Data Standards, data governance and data quality
  • Worked on importing and cleansing data from various sources like Teradata, Oracle, flat files and SQL Server 2005 with high-volume data.
  • Started and stopped DB2 subsystems.
  • Performing data management projects and fulfilling ad-hoc requests according to user specifications by utilizing data management software programs and tools like Perl, Toad, MS Access, Excel and SQL
  • Wrote SQL scripts to test the mappings, and developed a Traceability Matrix of business requirements mapped to test scripts to ensure any change control in requirements leads to test case updates.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
  • Developed regression test scripts for the application; involved in metrics gathering, analysis and reporting to the concerned team, and tested the testing programs.
  • Performed analysis on mainframe data to generate reports for business users.
  • Identified and recorded defects with the information required for issues to be reproduced by the development team.
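
To illustrate the trigger work mentioned above, here is a minimal Oracle sketch of a data-entry check; the payments table, column and error message are hypothetical, not objects from the engagement.

    -- Reject inserts or updates that carry a non-positive payment amount.
    CREATE OR REPLACE TRIGGER trg_payment_check
    BEFORE INSERT OR UPDATE ON payments
    FOR EACH ROW
    BEGIN
        IF :NEW.payment_amount <= 0 THEN
            RAISE_APPLICATION_ERROR(-20001, 'Payment amount must be positive');
        END IF;
    END;
    /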

Environment: PL/SQL, Business Objects XI R3, Informatica 9.5/9.1/8.6 (ETL), Oracle 11g, Teradata V2R12/R13, Teradata SQL Assistant 12.0, DB2

Confidential, Chicago, IL

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
  • Setting up of environments to be used for testing and the range of functionalities to be tested as per technical specifications.
  • Wrote and executed unit, system, integration and UAT scripts in a data warehouse projects.
  • Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
  • Excellent experience and knowledge of data warehouse concepts and dimensional data modeling using the Ralph Kimball methodology.
  • Responsible for different data mapping activities from source systems to the EDW and data marts.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Used CA ERwin Data Modeler (ERwin) for data modeling (data requirements analysis, database design, etc.) of custom-developed information systems, including databases of transactional systems and data marts.
  • Responsible for analyzing various data sources such as flat files, ASCII Data, EBCDIC Data, Relational Data (Oracle, DB2 UDB, MS SQL Server) from various heterogeneous data sources.
  • Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).
  • Performed and interpreted ad hoc analyses as needed.
  • Involved in Teradata SQL development, unit testing and performance tuning, ensuring testing issues were resolved using defect reports.
  • Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into various databases.
  • Created UNIX scripts for file transfer and file manipulation.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
  • Ensured the onsite-to-offshore transition, QA processes and closure of problems and issues.
  • Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata (a metadata-check sketch follows this list).
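
A minimal sketch of the metadata cross-check described in the last bullet; expected_metadata is a hypothetical table holding the documented column specifications, while ALL_TAB_COLUMNS is the standard Oracle dictionary view supplying the actual definitions.

    -- Report columns whose physical size disagrees with the documented spec.
    SELECT m.table_name,
           m.column_name,
           m.expected_length,
           c.data_length AS actual_length
    FROM   expected_metadata m
    JOIN   all_tab_columns  c
      ON   c.table_name  = m.table_name
     AND   c.column_name = m.column_name
    WHERE  c.data_length <> m.expected_length;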

Environment: Informatica 8.1, Data Flux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files, Teradata

Confidential, Chicago, IL

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
  • Setting up of environments to be used for testing and the range of functionalities to be tested as per technical specifications.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Wrote and executed unit, system, integration and UAT scripts in a data warehouse projects.
  • Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Responsible for analyzing various data sources such as flat files, ASCII Data, EBCDIC Data, Relational Data (Oracle, DB2 UDB, MS SQL Server) from various heterogeneous data sources.
  • Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).
  • Performed and interpreted ad hoc analyses as needed.
  • Involved in Teradata SQL development, unit testing and performance tuning, ensuring testing issues were resolved using defect reports.
  • Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into various databases.
  • Created UNIX scripts for file transfer and file manipulation.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
  • Ensured the onsite-to-offshore transition, QA processes and closure of problems and issues.
  • Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.

Environment: Informatica 8.1, Data Flux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files, Teradata

Confidential, Livonia, Michigan

Data Analyst

Responsibilities:

  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center 9.0
  • Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Tested the reports using Business Objects functionalities like queries, slice and dice, drill down, cross tab, master detail and formulae.
  • Involved in Teradata SQL development, unit testing and performance tuning.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Tested the ETL Informatica mappings and other ETL processes (data warehouse testing).
  • Tested several stored procedures.
  • Validated several Business Objects reports. Reviewed and tested business requests for data and data usage.
  • Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into various databases.
  • Responsible for data mapping testing by writing complex SQL queries using WinSQL.
  • Created UNIX scripts for file transfer and file manipulation.
  • Validating the data passed to downstream systems.
  • Worked with Data Extraction, Transformation and Loading (ETL).
  • Involved in testing data mapping and conversion in a server based data warehouse.
  • Involved in testing the UI applications
  • Involved in Security testing for different LDAP roles.
  • Tested whether the reports developed in Business Objects are as per company standards.
  • Used Quality Center to track and report system defects
  • Involved in testing the XML files and checking whether data was parsed and loaded into the staging tables (a validation sketch follows this list).
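
A minimal sketch of the staging-table checks behind the XML testing above; stg_xml_orders and its columns are hypothetical names for a staging table populated from parsed XML.

    -- Required field populated on every parsed row?
    SELECT COUNT(*) AS null_key_rows
    FROM   stg_xml_orders
    WHERE  order_id IS NULL;

    -- Any business keys loaded more than once?
    SELECT order_id, COUNT(*) AS dup_count
    FROM   stg_xml_orders
    GROUP  BY order_id
    HAVING COUNT(*) > 1;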

Environment: Informatica 8.1/7.1, Business Objects, SQL, SQL Server 2000/2005, Teradata V2R6 (MLOAD, FLOAD, FASTEXPORT, BTEQ), Teradata SQL Assistant 7.0, Toad, UNIX, Shell Scripting

Confidential

Data Analyst

Responsibilities:

  • Designed and created test cases based on the business requirements (also referred to the source-to-target detailed mapping document and the transformation rules document).
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Used SQL for Querying the database in UNIX environment
  • Developed separate test cases for ETL process (Inbound & Outbound) and reporting
  • Involved with Design and Development team to implement the requirements.
  • Developed and Performed execution of Test Scripts manually to verify the expected results
  • Design and development of ETL processes using Informatica ETL tool for dimension and fact file creation
  • Involved in Manual and Automated testing using QTP and Quality Center.
  • Conducted black-box testing (functional, regression and data-driven) and white-box testing (unit and integration, with positive and negative scenarios).
  • Tracked, reviewed and analyzed defects and compared results using Quality Center.
  • Participated in MR/CR review meetings to resolve issues.
  • Defined the Scope for System and Integration Testing
  • Prepared and submitted summarized audit reports and took corrective actions.
  • Involved in Uploading Master and Transactional data from flat files and preparation of Test cases, Sub System Testing.
  • Document and publish test results, troubleshoot and escalate issues
  • Preparation of various test documents for ETL process in Quality Center.
  • Involved in Test Scheduling and milestones with the dependencies
  • Functionality testing of email notification in ETL job failures, abort or data issue problems.
  • Identified, assessed and communicated potential risks associated with the testing scope, product quality and schedule.
  • Created and executed test cases for ETL jobs to upload master data to repository.
  • Responsible for understanding enhancements and new features, and training others on them.
  • Conducted load testing and provided input into capacity planning efforts.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using LoadRunner.
  • Create and execute test scripts, cases, and scenarios that will determine optimal system performance according to specifications.
  • Modified the automated scripts from time to time to accommodate the changes/upgrades in the application interface.
  • Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.

Environment: Windows XP, Informatica PowerCenter 6.1/7.1, QTP 9.2, TestDirector 7.x, LoadRunner 7.0, Oracle 10g, DB2, UNIX AIX 5.2, PERL, Shell Scripting
