Data Architect/Tech Lead Developer Resume
North Brunswick, NJ
PROFESSIONAL SUMMARY:
- 16+ years of IT experience, including 13+ years of strong experience in designing innovative business solutions, Data Modeling, Data Analysis, and full-SDLC delivery across Enterprise Data Warehousing, Business Intelligence, and the Big Data Hadoop ecosystem, with strong business analysis, strategic roadmap planning, and solution architecture skills.
- 13+ years of hands-on, SME-level experience in Ab Initio and major ETL/BI tools, spanning requirement gathering, design, estimation, architecture, implementation, and delivery under Agile/Scrum, DevOps, Lean, and Waterfall SDLC models.
- More than 10 years of functional experience in the Financial/Banking domain, including Capital Markets, Investment Banking, Credit Risk Management, Asset Management, Basel regulatory requirements, BCBS 239, Dodd-Frank, CCAR, VaR, CVA, Compliance Risk Assessment, AML implementation, analytics platforms and monitoring, and compliance regulatory reporting.
- 2 years of hands-on experience in the Big Data HDFS ecosystem using Hadoop, Spark, Python, HDFS, Hive, HBase, Kafka, Sqoop, Flume, YARN, Cloudera, MapReduce, MongoDB, Pivotal Greenplum (MPP), and Talend Big Data Integration.
- Hands-on experience leading, interacting, and managing ETL/ELT architectures for Data Modeling, Data Integration, Data Migration, Master Data Management, Metadata Management, and Data Governance implementations, applying best practices and strategic approaches across EDW Data Warehousing/Business Intelligence and the Big Data HDFS ecosystem.
- Well practiced in planning, execution, and delivery of best practices, standardization, and data strategies for Data Quality, Data Migration, and Data Integration on Master Data Management, Metadata Management, and Data Governance implementations.
- Extensive hands-on experience with ETL tools: Ab Initio, MS SSIS, Talend Data Integration, and Informatica 9.x, along with CA ERwin 9.0.
- Expertise in all Ab Initio Products: GDE, Conduct>It, Co>Operating System, EME, Technical Repository, BRE, ACE, DQE (Data Profiler), Metadata Hub (MDH), Operation Console, Control Center, Express>It, Query>It, SFD, ICFF, Micrograph, Continuous Flow, Performance Tuning, Code Reviewing, EME Standard and Best Practices in Ab Initio.
- Working knowledge of BI tools: SAP Business Objects 6.5, MicroStrategy 10, QlikView, Actuate Analytics, and MS OLAP.
- Strong experience in RDBMS: Oracle 10g/11g, Teradata 15.x, Sybase ASE 15.x, MS SQL Server 2014, and IBM DB2.
- Extensive experience in Shell Scripting, AWK, SED, Perl, Python and Control-M/Autosys JIL Programming/Scheduling.
- Experience in Data Modeling Schemas (RDBMS, Multi-Dimensional Modeling (Star Schema, Snowflake), MOLAP, ROLAP and HOLAP) using CA Erwin Data Modeler r9.x, SAP Power Designer and SQL Data Modeler.
- Worked on client/server technology using J2EE, Python, and Microsoft tools: MS DTS, MS OLAP, SSIS, and SSRS.
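As a hedged illustration of the AWK/SED shell scripting listed above, the sketch below validates a small pipe-delimited feed; the file layout (id|name|amount) and field positions are hypothetical examples, not taken from any project named in this resume.

```shell
#!/bin/sh
# Hypothetical pipe-delimited feed: id|name|amount
feed=$(mktemp)
cat > "$feed" <<'EOF'
101|alice|250.00
102|bob|  13.50
103|carol|7.25
EOF

# SED: normalize stray whitespace around the delimiter
sed 's/| */|/g; s/ *|/|/g' "$feed" |
# AWK: record count and amount total as a simple data-quality check
awk -F'|' '{count++; total += $3}
           END {printf "records=%d total=%.2f\n", count, total}'
# prints: records=3 total=270.75

rm -f "$feed"
```

The same SED/AWK pipeline pattern scales to the feed reconciliation and data-quality checks that typically sit around ETL job runs.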
TECHNICAL SKILLS:
Language(s): C/C++, Python, JCL, J2EE, JMS, XML, UML Modeling, XSDL, SQL, PL/SQL, SED, AWK, UNIX Shell/Perl Scripting
GUI: Visual Basic 6.0, ActiveX, and ADO 2.5.
Big Data Technologies: Apache Hadoop, Spark 2, Python 3.x, MapReduce, HBase, Hive, MongoDB, Sqoop, Flume, Oozie, and YARN with Cloudera/Hortonworks clusters.
Data Warehousing - ETL Tools: Ab Initio 3.x (GDE, Co>Operating System 3.x), MS SSIS, Elementum 3.x, Talend Big Data, Informatica PowerCenter 9.x, Oracle ODI, MS DTS, and MS SSRS.
OLAP Tools: MS SSAS, SSRS, MicroStrategy 10.x, QlikView 12.x, SAP Business Objects 6.5.
Database Modeling Tool: MS Visio, Erwin Data Modeler r9, Oracle Data Modeler, SAP Power Designer.
Database(s): Oracle 10g/11g, Teradata 15.0, Sybase ASE 15.x, IBM DB2 UDB EEE, MS SQL Server 2000/2005/2008/2014, IBM Netezza 3.0.
Web Technologies: Active Server Pages 3.0, VB Script, and JavaScript.
Web/Application Server: Servlets, IIS 4.0, MTS, JRun Server.
Operating System(s): Windows, Windows NT, SUN Solaris 5.8, HP-UX, IBM-AIX, Red Hat Enterprise Linux 3.0, and Cygwin 10.0.
Job Scheduling Tools: IBM Tivoli, Control-M, Autosys, and UNIX Crontab.
Other Tools: MS Office Suite, MS Visio, TOAD 11.5, InterScope 7.0, AquaData, SQuirreL SQL, ScriptIT.
PROFESSIONAL EXPERIENCE:
Data Architect/Tech Lead Developer
Confidential, North Brunswick, NJ
Responsibilities:
- Gathered all the sources of CLRTY 2.0, such as EIW, SSDM, SHAW, ECaR, ACAR, HOGAN, DRM, and the WFHE and WFHM Consumer Lending mortgage data.
- Worked on program strategy and roadmap planning under DevOps and Agile/Scrum methodology with the Product Owner, senior management team, senior operations manager, directors, program manager, and downstream stakeholders.
- Analyzed the CLRTY 2.0 requirements across all implementations: SSDM, FDR to SHAW, Single MSP ETL, and CLRTY DQE.
- Prepared the FRD (Functional Requirement Document), High Level Design, and Low-Level Design (LLD) documents.
- Interacted closely on all deliverables and issues with end users, business analysts, data analytics, program managers, team leads, DBA teams, and Ab Initio tech leads, along with the Ab Initio vendor team.
- Contributed to the design of the new CLRTY 2.0 framework architecture, replacing SQL-based data processing with ETL/ELT processing, and assisted senior data architects, consumer lending experts, BAs, and senior managers.
- Designed the CLRTY 2.0 architecture/framework for DQE/DMC in Data Governance for the BCBS 239 implementation, and the dimensional data mart/star schema models, using MS Visio 2016 and Erwin Data Modeler 9.0.
- Worked on Ab Initio-Hadoop integration prototypes and Co>Operating System 3.3.2 integration with YARN.
- Managed data lineage, source versioning, data quality standardization, logical models, physical models, and impact analysis; organized the correlated projects; and supported operational issues in Data Dictionary, Master Data Management, Metadata Management, and Data Governance using Ab Initio EME, Express>It, DQE, and Enterprise Metadata Hub (EMH), along with SQL/PL/SQL.
- Supported enhancements to existing development using Informatica 9.x, SSIS, SSAS, and SSRS with SQL and Python 3.6 on SQL Server 2008/2014.
- Managed the migration from Informatica to Ab Initio and provided technical support on both ETL tools.
- Performed Ab Initio/generic graph development using MFS data parallelism, the PDL approach, JMS queue batch processing, component folding, and performance optimization of Ab Initio resource utilization for high-volume processing, and developed UNIX shell scripts using AWK and SED.
- Created the BRE (Express>It 3.2) workflow/templates for reusable business application configurations (appconf) and rulesets on various CLRTY 2.0 DQE based implementations for BI DQ Analytics Dashboard Apps with BA and QA Team.
- Unit tested the Ab Initio graphs and UNIX shell scripts, and participated in QA/User Acceptance testing of the deliverables.
- Managed the Hanger/Production migration deployments and maintained the stream-based deployment strategy.
- Supported the job automation schedules (continuous and batch) on a daily basis and provided on-call support.
- Optimized performance issues, performed impact analysis with end users on data processes, and provided analytical solutions for production issues.
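The graph-run wrapper scripting described above can be sketched roughly as follows; the step names, commands, and logging format are hypothetical stand-ins, not the actual project wrappers.

```shell
#!/bin/sh
# Hedged sketch of a graph-run wrapper script: run each step, log it,
# and fail fast so the scheduler (Autosys/Control-M) sees a non-zero
# exit code. Step names and commands are hypothetical placeholders.
LOG=$(mktemp)

run_step() {
    name=$1; shift
    echo "START $name" >> "$LOG"
    "$@" >> "$LOG" 2>&1
    rc=$?
    echo "END $name rc=$rc" >> "$LOG"
    [ "$rc" -eq 0 ] || { echo "FAILED at $name (rc=$rc)"; exit "$rc"; }
}

run_step pre_process  true                    # placeholder pre-step
run_step main_graph   echo "graph completed"  # placeholder graph run
run_step post_process true                    # placeholder post-step

echo "wrapper finished ok"
rm -f "$LOG"
```

The fail-fast exit is what lets a job scheduler distinguish a clean run from a partial one and trigger restart or alerting logic.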
Environment: Ab Initio GDE 3.3.2, Ab Initio Co>Operating System 3.3.2 and 3.2, Ab Initio EME 3.3.2, Express>It 3.3.2, Conduct>It 3.3, PDL, Informatica PowerCenter 9.6.1, Informatica Data Quality 9.6.1, MS SSIS, MS SSAS, MS SSRS, Autosys JIL programming/scheduling, SQL, PL/SQL, Python 3.6, SQL Developer 4.2, UNIX Shell Scripting, Oracle Data Modeler 9.0, MS Visio 2016, Perl, AWK, SED, J2EE, JavaScript, JMS Queue, Atlassian JIRA Agile 6.4, Atlassian Confluence 6.4, Anaconda 3.x, CloudBees Jenkins Enterprise 2.4, DevOps, Cloud, XML, XSD, CVS, SunOS 5.10, Linux, Teradata 15.0, SQL Assistant 15.0, SQL Server 2008/2014, MongoDB, Oracle 10g/11g, Cygwin 10.0, and Windows 10/7.
Data Architect/Lead Developer/Analyst
Confidential, NY
Responsibilities:
- Captured all the sources of the EDBI system, such as DCS-CM, PDS, PES, CDS-A, CDS-R, SBR (Sales), TDF (Ticket), SDF (Sales), E-Code, GatePass, and Compensation; these sources serve various purposes in the airline system's operations.
- Collaborated on program strategy and roadmap planning under Agile/Scrum methodology with the Product Owner, senior management team, senior operations manager, directors, program manager, and downstream stakeholders.
- Analyzed the requirements of this program's implementations: DCS-CM feed processing for passenger and bag activities, PDS/CDS data load replication, GatePass, compensation processing, OFC replication between data centers, and productization of all the implementations.
- Prepared the FRD (Functional Requirement Document), High Level Design, and Low-Level Design (LLD) documents.
- Interacted closely on all deliverables and issues with end users, business analysts, data analytics, program managers, team leads, DBA teams, and Ab Initio tech leads, along with the Ab Initio vendor team.
- Implemented the re-engineering of the PDS/CDS data load replication to load data into the EDW and IDW environments in a single process, with logical and physical data modeling (star/snowflake schema).
- Delivered 100% of sprint deliverables on time under Agile/Scrum planning, which helped the EDBI program migration.
- Designed the Dimensional Data Mart/Star Schema Modelling using Erwin Data Modeler 9.0, ORACLE SQL Modeler.
- Performed the Data Lineage, Source Versioning, Logical Models, Physical Models, Data Analysis using Ab Initio EME, Metadata Hub, and Technical Repository Portals.
- Worked exclusively with XML list/XSD sources using XML components, along with other regular data sources.
- Implemented Ab Initio/generic graphs using MFS data parallelism, the PDL approach, JMS queue batch processing, component folding, and performance optimization of Ab Initio resource utilization for high-volume processing, and developed UNIX shell scripts using AWK and SED.
- Modified the existing BRE (Express>It 3.2) workflows/templates for reusable business configurations and rulesets for customer rewards program, compensation, and BI analytics purposes, along with the business team.
- Developed SQL and PL/SQL procedures, triggers, and functions on Teradata 15.x and Oracle 11g, and used the Teradata utilities FastExport, FastLoad, MultiLoad, TPump, and BTEQ with Ab Initio for high-performance processing.
- Unit tested the Ab Initio graphs and UNIX shell scripts, and participated in QA/User Acceptance testing of the deliverables.
- Managed the Hanger/Production migration deployments and maintained the stream-based deployment strategy.
- Supported the job automation schedules (continuous and batch) on a daily basis and provided on-call support.
- Optimized performance issues, performed impact analysis with end users on data processes, and provided analytical solutions for production issues.
Environment: Ab Initio GDE 3.2.5, Ab Initio Co>Operating System 3.2.5 and 3.0, Ab Initio EME 3.2 and 3.0, BRE 3.1, ACE 3.1, Express>It 3.2, Conduct>It 3.2, PDL, IBM Tivoli workflow programming/scheduling, Crystal Reports, SQL, PL/SQL, Python 3.0, SQL Developer 4.2, UNIX Shell Scripting, Oracle Data Modeler 9.0, Perl, AWK, SED, J2EE, JavaScript, JMS Queue, Atlassian JIRA Agile 6.4, Atlassian Confluence 6.4, XML, XSD, CVS, SunOS 5.10, Linux, Teradata 15.0, SQL Assistant 15.0, Oracle 10g/11g, Cygwin 10.0, and Windows 10/7.
Data Architect/Lead Data Analyst
Confidential, Wall St, NY
Responsibilities:
- Proficiency-level experience with the compliance AML/risk monitoring systems MANTAS, Actimize, Citi's Archer, AML Metrics, GLMS, NPA, Citi KYC, Mandatory Absence, Trade Surveillance, and other AML monitoring tools.
- Analyzed the requirement of various implementations: COMRAD CR8 Metrics Implementation, Prepaid Cards, MANTAS and Actimize Extractions, Prepaid Cards Metrics Calculation, Trade Surveillance, Non AML Monitoring, Compliance Horizontals (HARA) and Compliance Staffing.
- Performed data analysis of AML metrics and non-AML compliance and risk assessment, building compliance risk analytical platforms for each line of business (LOB) geographically, with compliance risk officers and compliance management.
- Prepared all the FRD (Functional Requirement Document), High Level Design, and Low-Level Design (LLD) documents.
- Interacted closely on all deliverables and issues with end users, business analysts, compliance analytics, compliance officers, compliance regional teams, DBA teams, and the CATE team, along with the Ab Initio vendor team.
- Achieved the re-engineering of all COMRAD applications, moving them to the EDW/BI architecture with help from the Citigroup CATE team; in particular, data modeling, design of the logical and physical data models, separation of all sources into a set of correlated star/snowflake schema model streams, Ab Initio graphs, pre/post processes, job-run wrapper scripts, and scheduling of the streamed jobs.
- Designed the Dimensional Data Mart/Star Schema Modelling using Erwin Data Modeler 9.x, SQL Data Modeler 3.2.
- Built the EDW ETL/BI architecture, primarily designing the logical and physical dimensional and relational models, such as Compliance Staffing, AML Metrics Analytics, MANTAS metric calculation, Actimize Analytics, and Citi KYC Analytics, following Citi CATE standards and data modeling best practices.
- Managed data lineage, source versioning, logical/physical models, and impact analysis; organized the correlated projects under Master Data Management; and supported operational activities using Ab Initio EME, Metadata Hub, and Technical Repository portals.
- Contributed to the integration of MANTAS data extraction from the EDW-EAP system, which runs on Big Data Hadoop through the Apache Spark/HBase/MapReduce/Hive platform, using the new Ab Initio Hadoop components.
- Implemented Ab Initio/generic graphs using MFS data parallelism, the PDL approach, continuous flow, component folding, dynamic (ICFF) lookups, and performance optimization of Ab Initio resource utilization for high-volume processing, along with UNIX shell scripting using AWK and SED.
- Developed the Dashboard Platforms, QlikView Scripts, and SQL, PL/SQL Procedures, Triggers on ORACLE, Sybase.
- Unit tested the Ab Initio graphs and UNIX shell scripts, and participated in User Acceptance testing of the deliverables.
- Resolved issues and problems affecting the 24x7 batch and continuous runs that support users in all regions.
- Estimated resource utilization, the AIX-to-Linux migration, and database/NAS/SAN space requirements, planning with the portfolio architect team and CTI resource allocation teams.
- Managed the production deployments and maintained the stream-based deployment strategy.
- Provided end-user support for all production issues, resolved transactional data processing and configuration issues, and worked with vendors on application patch installations/deployments.
- Supported the job automation schedules (continuous and batch) on a daily basis and provided on-call support.
- Participated primarily in Disaster Recovery (DR) tests for disaster failover scenarios and network problems.
- Optimized performance issues, performed impact analysis with end users on transactional data processes, and provided analytical solutions for production issues.
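The 24x7 batch monitoring and on-call support described above typically rests on simple log-scanning checks; a minimal sketch follows, with a hypothetical log format and job names (not actual system logs):

```shell
#!/bin/sh
# Hypothetical batch log: timestamp level job=<name> status=<...>
log=$(mktemp)
cat > "$log" <<'EOF'
2016-03-01 02:10:05 INFO  job=mantas_extract status=SUCCESS
2016-03-01 02:45:12 ERROR job=actimize_load status=ABORTED rc=8
2016-03-01 03:05:44 INFO  job=kyc_metrics status=SUCCESS
EOF

# AWK: pull the job name out of every ERROR line for the alert message
failures=$(awk '/ ERROR / {for (i = 1; i <= NF; i++)
                               if ($i ~ /^job=/) print $i}' "$log")

if [ -n "$failures" ]; then
    echo "ALERT: failed jobs -> $failures"
else
    echo "all jobs clean"
fi
rm -f "$log"
```

A check like this, run from the scheduler after each batch window, is usually what drives the paging in an on-call rotation.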
Environment: Ab Initio GDE 3.2.5, Ab Initio Co>Operating System 3.2.5 and 3.0, Ab Initio EME 3.2, DQE (Data Profiler) 3.2, BRE 3.1, Conduct>It 3.2, Metadata Hub, Technical Repository, PDL, MicroStrategy 10.0, QlikView 12.x, Business Objects, SQL, PL/SQL, Python 3.4, SQL Developer 4.1, UNIX Shell Scripting, CA ERwin Data Modeler 9.0, Oracle SQL Modeler 4.0, Big Data, Hadoop, Talend Big Data Integration, HBase, Apache Spark, MapReduce, Perl, AWK, SED, J2EE, JSP, JavaScript, Eclipse, XML, SVN, CVS, SunOS 5.10, Linux, Autosys JIL/Control-M scheduling, Sybase ASE 15.x, Netezza 3.0, Oracle 11g, IIS 7.0, and Windows 2000/7.
Data Architect/Lead Developer/Analyst
Confidential, Wall St, NY
Responsibilities:
- Proficiency-level understanding of the CAW system and upstream source systems: Trade Booking Front Office, ACE Server, Aggregator, CVA (Credit Valuation Adjustment), Global Market Risk & CE (Credit Engine), and Global Market Data.
- Analyzed the requirements of various CAW implementations for ACE PSE (Pre-Settlement Exposure), CVA, VaR, EPE, ETE, Dodd-Frank, and 0-7 Day Credit Risk Exposure.
- Interacted closely with business analysts, risk analytics, credit risk officers, traders, CAW end users, DBA teams, the CATE team, and business stakeholders on roadmap planning and strategic approaches with best practices.
- Achieved the re-engineering of the CAW application across all processes, particularly architecture issues, data modeling, design of the logical and physical data models, separation of all sources into a set of correlated star/snowflake schema model streams, Ab Initio graphs, pre/post processes, and job-run wrapper scripts.
- Prepared the FRD (Functional Requirement Document), High Level Design, and Low-Level Design (LLD) documents.
- Integrated the implementation of 0-7 Day Portfolio, CCAR RWA (Risk Weighted Asset), Dodd-Frank Stress Testing and Basel Regulatory Reports with ACE, CE, and OPTIMA Teams and closely worked with CCAR Team on Risk Analytic and Liquidity Management.
- Developed Ab Initio/generic graphs using, as each purpose required, the PDL approach, Data Profiler (DQE), MFS, continuous flow (CAW What-If processing using JMS services from JBoss Messaging), component folding, dynamic (ICFF) lookups, XML processing, and UNIX shell scripting using AWK and SED.
- Implemented the parallelism approach and EME standards for the existing Ab Initio graphs, avoiding high resource utilization and performance issues, following best practices.
- Developed the SQL, PL/SQL Procedures and functions on Sybase for CAW GUI application data access.
- Unit tested the Ab Initio graphs and UNIX shell scripts, and participated in User Acceptance testing of the deliverables.
- Resolved performance tuning issues and problems affecting the 24x7 batch and continuous runs supporting the trading markets of various regions (New York, London, Tokyo, and LATAM).
- Provided end-user application support for production issues such as OTC derivatives, PFE exposure, What-If analysis, ETF, 0-7 day portfolio, and CEF reports; performed root-cause analysis; resolved transactional data processing and configuration issues; and worked with the vendor on application patch installations/deployments.
- Supported the job automation schedules (continuous and batch) daily, with on-call support for all trade markets.
- Delivered 100% of deliverables on time and provided credit-risk-based analytical solutions for user support and data requests on trade portfolio pricing in What-If for OTC derivatives, ETF, and other trading products.
- Participated primarily in Disaster Recovery (DR) failover testing and network problem resolution.
- Prepared various documents for process design, job procedures, and end-to-end processes.
Environment: Ab Initio GDE 3.0, Ab Initio Co>Operating System 3.0, Ab Initio EME 3.0, DQE (Data Profiler) 3.0, Elementum 3.1, Conduct>It 3.0, PDL, Autosys JIL programming/scheduling, SQL, PL/SQL, DbVisualizer 7.2, SQL Developer 3.2, Data Modeler 3.2, TOAD, UNIX Shell Scripting, CA ERwin Data Modeler 9.0, SAP Power Designer, Perl, AWK, SED, Python, J2EE, JSP, JMS Messages 2.0, JBoss Messaging Services, JavaScript, JRun Server, Eclipse, XML, SVN, CVS, SunOS 5.10, Linux, Crontab, Sybase ASE 15.x, Oracle 11g, and Windows 2000/8.
System Analyst
Confidential, Danbury, CT
Responsibilities:
- Developed a strong understanding of the source application systems and back ends, and analyzed the requirements and business rules for development/enhancement.
- Designed the Metreo Vision Application Vector Cubes for functional requirements and business rules.
- Analyzed and involved in the high-level design and prepared the Low-level Design (LLD).
- Developed complex mappings using transformations such as the Source qualifier, Aggregator, Expression, Lookups, Filter, Router, Sequence Generator, Update Strategy, and Joiner.
- Optimized performance by tuning the Informatica ETL code as well as the SQL.
- Developed the Ab Initio graphs, Metreo Vision vector cubes, QlikView reports, dashboard bulletins, and MIS reports; UNIX shell scripting using AWK and SED; and batch job processing and scheduling via crontab.
- Developed the SQL, PL/SQL Trigger, Procedures and functions on ORACLE and Teradata.
- Developed the Ab Initio Graphs for loading/unloading using FastExport, FastLoad, MultiLoad, Tpump utilities and BTEQ.
- Provided the end-user support for all the production issues, resolved the transactional data processing issues, configuration issues and worked with the vendor for application patch installation/deployments.
- Prepared the various documents for Process Design, Job Procedures and End-to-End Process document.
- Involved in deployment, patch installation for Metreo application vision in all the environments, and maintained the versioning control for all the developments and deployments.
- Tuned performance and resolved production issues with end users on transactional data processes.
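The crontab-based batch scheduling mentioned above amounts to entries of roughly this shape; all paths, times, and script names below are hypothetical placeholders, not the actual production schedule.

```
# Crontab fragment (min hour dom mon dow command) -- illustrative only.
# Nightly feed load at 01:30, log appended for the on-call trail:
30 1 * * *  /apps/etl/bin/run_nightly_load.sh >> /apps/etl/logs/nightly.log 2>&1
# Weekly archive of processed feeds, Sundays at 04:00:
0  4 * * 0  /apps/etl/bin/archive_feeds.sh    >> /apps/etl/logs/archive.log 2>&1
```

Entries like these are typically installed per service account with `crontab -e` and verified with `crontab -l`.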
Environment: Informatica Power Center 7.2, Ab Initio GDE 2.14, Ab Initio Co>Operating System 2.14, Metreo Vision 3.2, QlikView 9.x, JAVA, JBOSS 4.0, Apache Server, XML, SQL, PL/SQL, Oracle 10g, Teradata 7.2, Toad 9.0, UNIX Shell Scripting, AWK programming, UNIX Cron Job Scheduling, SunOS 5.10 and Windows 2000 Professional.
Lead ETL Developer/Analyst
Confidential, Richardson, TX
Responsibilities:
- Solid understanding of the source legacy systems and Backend (Oracle, IBM DB2 and Teradata).
- Analyzed the requirements and business rules for development/enhancement.
- Designed the architecture of system for functional requirements and business rules.
- Analyzed and contributed to the High-Level Design (HLD) and Low-Level Design (LLD).
- Analyzed the Database and Data modeling (Star Schema, Snowflake) as per requirements and business rules.
- Developed/enhanced the Ab Initio graphs (reusable and continuous flow) and Shell Wrapper Scripts (graph execution and Job schedule) and developed the generic/custom graphs for reusable functionalities.
- Developed/enhanced the PL/SQL stored procedures, triggers in ORACLE and Teradata.
- Tuned the performance on execution of graphs, Data Unload and High Volume Data Load (Teradata) and reviewed the graph using Ab Initio’s Review Checklist and EME standards.
- Scheduled the Ab Initio jobs using UNIX Crontab, Autosys and Control-M.
- Prepared the various documents for Job Scheduling Process Design, Job Procedures and End-to-End Process document for the Graphs, Wrapper script and Production Deployment document.
- Involved in deployment architectures, Trouble-shooting, Root-cause Analysis for each prod release and code version.
- Participated in the Production IT Problem Management Process using Standard Processes and Tools.
- Participated in Production Support Problem analysis - provide specific, relevant production information to assist in troubleshooting the problem as Production on call support.
Environment: Ab Initio GDE 1.13, Ab Initio Co>Operating System 2.13, ERwin Data Modeler 7.x, COBOL, IMS, JCL, PL/SQL, Oracle 10g, IBM DB2 UDB, Teradata 5/2, TOAD 7.6, UNIX Shell Scripting, Perl Scripting, Control-M/Autosys, ScriptIT, Express, HP-UX B.11.11, SunOS 5.9, and IBM-AIX.