
Sr Data Architect / Database Developer Resume

Hoffman Estates, IL

PROFESSIONAL SUMMARY:

  • Solid 16+ years of IT experience across enterprise-level BI/DW solutions, mainframe, client-server, mobile, cloud and web-based applications.
  • Expert in understanding data and in designing and implementing enterprise platforms such as Hadoop data lakes and large data warehouses.
  • Over 2 years of experience as a Hadoop Architect with strong exposure to Hadoop technologies such as HDFS, MapReduce, Hive, HBase, Sqoop, HCatalog, Pig and ZooKeeper.
  • Proven expertise in all aspects of data management methods and practices including:
  • DW schema design, re-engineering and reverse engineering, data profiling, data quality, data architecture strategy, data migration, cloud storage, conversion with ETL tools, and performance tuning.
  • A Senior SQL Server performance tuning expert with practical experience from midsized databases to very large databases.
  • Hands-on experience with SQL Server 2000/2005/2008/2008 R2/2012, including T-SQL programming, DTS, performance tuning, reporting, logical/physical database design and troubleshooting.
  • Expert in SQL Server OLAP database development including KPIs, Data mining, working with changing dimensions.
  • Broad understanding of Data Warehousing and Business Intelligence solutions; able to use software such as Oracle, DB2, MicroStrategy, Informatica and SQL Server Integration Services (SSIS) to process data, access different system resources and incorporate T-SQL statements.
  • Extensively worked with data warehousing tools such as Oracle Warehouse Builder, MS SQL Server SSIS, DTS Packages, AbInitio, ETL Informatica in Oracle/ SQL Server environments.
  • Experienced in ETL implementation using SQL Server Integration Services (SSIS) in OLTP and Data Warehouse Environment.
  • Experienced in creating database objects and data marts such as tables, indexes, stored procedures, functions, triggers and writing complex queries.
  • Experience in IBM Mainframe Applications - Architecture, Design, Application development, Implementation, Maintenance, Enhancement and Testing.
  • In-depth knowledge on application development methodology with strong programming skills in VS Cobol II, MVS, JCL, VSAM, DB2, IMS, CICS, MQ Series, Teradata, REXX, Oracle 9i, SQL, PL-SQL.
  • Expertise in mainframe tools such as TSO/ISPF, QMF, SPUFI, CHANGEMAN, FILE-AID, XPEDITOR, INTERTEST, PLATINUM and MAINVIEW, with knowledge of schedulers such as ESP and CA7.
  • Very good experience in SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS).
  • Expert in writing parameterized queries, drill-through reports and formatted SQL Server reports in SSRS 2005/2008/2008 R2/2012 using data from ETL loads, SSAS cubes and various heterogeneous data sources.
  • Senior level SQL Server 7/2000/2005/2008 Database Administration.
  • Expert in analyzing, designing, developing, installing, configuring and deploying MS SQL Server suite of products with Business Intelligence in SQL Server Reporting Services 2005/2008, SQL Server Analysis Services of 2005/2008 and SQL Server Integration Services.
  • Proficiency in MDX implementations, performance reporting, data modeling, data-mart, and data warehouse implementations.
  • Solid experience in metadata definition, implementation and maintenance, identification and implementation of new business rules to data rules, transformation program library maintenance, and XML file generation.
  • Vast experience on Extraction, Transformation and Load (ETL) strategy, design and implementation. Includes ETL across different platforms such as Sybase, Teradata, Oracle, VSAM databases, excel and flat files.
  • Extensively worked on OLAP Data Analysis techniques - Drill Down and Roll Up, Slice and Dice. Designed and developed Data Marts by following Star Schema and Snowflake Schema Methodology.
  • Designed and developed the architecture for Data Warehousing components across subject areas and various technical domains (e.g. tool integration strategy, data massaging, data cleansing, data source ETL strategy, data staging, movement, and aggregation, information and analytics delivery and data quality strategy).
  • Worked closely with various customers/ staff to analyze the existing data sources for the purpose of integration with planned data frameworks.
  • Possess strong ER modeling and dimensional modeling skills for OLAP, OLTP, Enterprise Data Management (EDM), T-SQL, NoSQL, DSS and ODS using databases such as Oracle 11g/10g/9i/8i, Sybase 12, SQL Server 7/2000/2005/2008/2012 and Teradata V2R5/V2R6.
  • Extensively used SSIS transformations such as Lookup, Derived column, Data conversion, Aggregate, Conditional split, SQL task, Script task and Send Mail task etc.
  • Hands-on experience with cloud storage: SQL Azure, DocumentDB, Table Storage and Blob Storage.
  • Driven to achieve goals and overcome obstacles, confident, highly energized, effective, and persuasive communicator with strong technical, and leadership skills.
  • Retail, Financial, Health Care services & other industry domain expertise in the following areas:
  • Retail Web and Mobile, Cloud Applications, E-Commerce, POS, Customer Order Management, Customer Experience Management, Web Analytics, Dental and ACA Health Plans, Credit Card, Debit Card, Trading, Anti-Money Laundering, Equity and Debt Events, Sales, Equities and Derivatives, Customer within Retail, Banking and Mortgage, Auto, Insurance applications data.

TECHNICAL SKILLS:

Operating Systems: UNIX, Linux, Windows NT/95/2000/XP/7, MVS

Scripting Languages: Unix Shell Scripting, PERL, ANSI SQL, T-SQL, PL/SQL, Focus, VB Script, Java Script, HTML, XML, XSL.

Mainframe: COBOL, Micro Focus COBOL, CLIST, DMS, IMS DB/DC, CICS, QLP, REXX, RPG, SQL, SAS, CA-Easytrieve, MVS JCL, VSAM, TSO/ISPF, File-Aid, QMF, SPUFI, Xpediter, MS Office, Control-M, CA-7

Data Manipulation tools: From UNIX shells (ksh, bash, csh) and scripting languages (Perl, awk) to high-end (Ab Initio) and low-end (MS SSIS) ETL tools; Oracle DBA Console, Enterprise Manager, TOAD, AQT, XML, SQL Server 2005 Management Studio, BIDS, IBM Control Center Expert

Version control/ tracking systems: PVCS, SourceSafe, Ab Initio EME, Rational Clear Quest, Subversion

Reporting Tools: Business Objects 6.5.1/XI R2/R3.1 SP2/SP3, Crystal Reports 8.5/9.0/10/XI R3.1 SP2/SP3, Xcelsius, MicroStrategy, Cognos, SSRS, SSAS, BIDS, Tableau, SharePoint, Report Builder 1.0/2.0/3.0

ETL Tools: Informatica PowerCenter 8/7.1.2/5.1.1/5.0, Ab Initio Co-Operating System (2.14/2.15), SAS 9.3, Oracle Warehouse Designer, DTS, SSIS

Business Intelligence tools: Desktop Intelligence, Reporter, Designer, Supervisor, Web Intelligence, Broadcast Agent and Publisher, Big Data

Databases: Oracle 8i/9i/10g/11g, Sybase 12.0, Warehouse Builder, FileMaker Pro 5/7, MS Access, Oracle Forms, DB2, SQL Server 7.0/2000/2008/2012, SQL Server Client 7.0/2000/2005/2008/2012, Big Data, Teradata V2R6, SAS

Cloud technologies: Azure Cloud services, Storage Services, Azure Service bus concepts

DBA Process: NoSQL, MySQL

Database Processes: Archival procedures, Database tuning and designing and Database security and recovery

Database deployment tools: CVS, Clear Case, Puppet

Big data technologies: Hadoop, Map Reduce, Hive, HBase, Mahout, Sqoop, Pig, Zookeeper, Flume

Web, Client-Server and Desktop: ASP, JSP, MS Visual Interdev, VB6, .NET, XML, CSS, MS Office, MTS, COM, DCOM, and Web Logic Server

Quality Standards: SEI-CMM /CMMI, ISO 9000, PCMM Expert 15 Years

Data Modeling and other tools: ERwin 3.x/4.x (Data Model Mart), ER/Studio 8.0/8.5/9/9.0.1, Rational Rose, Sybase PowerDesigner, SmartDraw, ERD, TOAD 4.5, SQL Navigator 4.0, DataStage

Projects management: Technical Lead, Project Leader, JIRA

PROFESSIONAL EXPERIENCE:

Confidential, Hoffman estates, IL

Sr Data Architect / Database developer

Responsibilities:

  • Enterprise Data Lake is the first big data initiative at SHC to bring huge volumes of data from external systems into the Hadoop environment. The architecture for the Enterprise Data Hub is based on Pivotal HD.
  • The enterprise data lake supports various use cases including analytics, processing, storage and reporting of voluminous, rapidly changing, structured and unstructured data.
  • Informatica 9.6 tools were used to ingest data into the data lake; MapReduce and Pig were used for data processing and transformation.
  • Leading the development of data analytics practice, team and infrastructure as well as driving the overall strategic plan for data analytics practice.
  • Providing leadership for establishing enterprise level strategy for leverage and usage of data to gain business insights and drive company strategy and decisions.
  • Responsible for building the practice, processes and infrastructure for data analytics from ground up. Responsible for moving the organization to be a data driven organization.
  • Continually review the environment to challenge the assumptions around new sources of data, tools, talent, and potential partners.
  • Supported data cleansing, conversion and validation activities with research, analysis and tool identification. Worked to migrate the data to SAP S4/HANA.
  • Worked closely with business and IT Subject Matter Experts to analyze and resolve data issues.
  • Worked to optimize Local commerce search and Loyalty member purchasing behavior.
  • Developed and implemented Sears retail strategies in support of both UpTime Parts revenue and the Expand Customer Reach initiative for Sears Parts Direct portal.
  • Responsible for the staging and core databases; explored the feasibility of Spotfire, Business Objects and Tableau as reporting tools.
  • Understanding the nature of the data from different OLTP systems and designing the ingestion processes into HDFS using Informatica 9.6 and Sqoop.
  • Working on Hadoop File formats TextInputFormat and KeyValueTextInputFormat
  • Designing data model on Hbase and Hive.
  • Creating MapReduce jobs for ad-hoc data requests.
  • Applying partitioning and bucketing techniques in Hive to improve performance.
  • Optimizing Hive and HBase queries.
  • Designing HBase column schemas.
  • Played key role to launch new iOS/Android app RELAY - used elastic search
  • Creating common data interface for Pig and Hive using Hcatalog.
  • Understanding the business requirements and needs and drawing the road map for Big data initiatives.
  • Driving POC initiatives to assess the feasibility of traditional and big data reporting tools (Spotfire, Business Objects, Tableau, etc.) against the data lake.
  • Scheduling big data jobs using the in-house scheduler Appworx.
  • Using Kerberos and LDAP security authentications.
  • Implementing POCs for big data tools such as Mahout and Impala.
  • Created logical and physical data models for OLTP and data warehouse systems.
  • Established Business applications road-maps within Enterprise platforms and applications.
  • Worked with C-level executives (CEO, CFO, LOB presidents) and business VPs to deliver product architecture and solutions for key strategic business initiatives in advanced digital marketing and big data.
  • Designed and developed NoSQL solutions for all users.
  • Provided expert level stored procedures development in Oracle, SQL Server and DB2 environments.
  • Provided strategic leadership for cross functional projects and rollouts.
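The Hive partitioning and bucketing work listed above relies on hash-based bucket assignment: rows are routed to a fixed number of bucket files by key, so joins and samples only scan the relevant bucket. A minimal Python sketch of the idea, assuming an integer key column (the `customer_id` column and bucket count are illustrative, not the actual warehouse schema):

```python
# Sketch of how Hive assigns rows to buckets for a table declared
# CLUSTERED BY (customer_id) INTO 4 BUCKETS. Column names are hypothetical.

NUM_BUCKETS = 4

def bucket_for(customer_id: int, num_buckets: int = NUM_BUCKETS) -> int:
    # For integer columns the hash is the value itself, so the bucket
    # is simply value mod num_buckets.
    return customer_id % num_buckets

rows = [{"customer_id": cid, "amount": cid * 10} for cid in range(8)]

# Writing: route each row to its bucket file.
buckets = {b: [] for b in range(NUM_BUCKETS)}
for row in rows:
    buckets[bucket_for(row["customer_id"])].append(row)

# Reading: a lookup by key only scans the one relevant bucket.
target = 6
candidate_rows = buckets[bucket_for(target)]
print(len(candidate_rows))  # 2 of 8 rows scanned instead of all 8
```

Partitioning works the same way at the directory level (one directory per partition value); bucketing subdivides within a partition by hash.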

Technologies: SQL Server 2008, 2008R2 and 2012, T-SQL, SQL Profiler, Oracle 11g, NoSQL, OLAP, OLTP, Teradata, DB2, Zeke scheduler, Azure cloud, VB Script, Data Warehousing, ER STUDIO 8.5/9.0 (Data Architect), Hadoop, Map Reduce, Hive, HBase, Mahout, Sqoop, Pig, Zookeeper, Flume, XML, Informatica 9.6, Microsoft Project/Office.

Confidential, River woods, IL

Senior Data Modeler

Responsibilities:

  • This position involved developing key aspects of the information architecture, including logical/physical data models, data flow diagrams and metadata strategy and data lineage mapping documents.
  • Facilitated JAD sessions to determine data rules and conducted Logical Data Modeling (LDM) and Physical Data Modeling (PDM) reviews with Data SMEs.
  • Participated in data analysis and data dictionary and metadata management -Collaborating with business analysts, SME’s, ETL developers, data quality analysts and database administrators.
  • Participated in design of Data Asset Management (DAM), Marketing, Customer Contact Channel, IVR, Targeted Payment Protection business processes, Confidential applications architecture and Confidential data model.
  • Prepared the complete data mapping for all the migrated jobs using SSIS.
  • Wrote complex queries in Teradata SQL assistant to check the data from Source and Confidential .
  • Used Teradata utilities (MLOAD and FLOAD) to load the source files into the test region and queried the loaded data.
  • Created databases and schema objects including tables and indexes, applied constraints, connected various applications to the database, and wrote functions, stored procedures and triggers.
  • Developed relational model design, dimensional modeling, and ETL processes in Data warehouse (DWH) and Decision support system (DSS) environment. Validated and maintained the enterprise-wide logical data model for Data Staging Area, DWH and DSS.
  • Aided and verified DDL implementation by DBAs; corresponded and communicated data and system specifications to DBAs, development teams, managers and users. The model totaled 1,170 entities, including fact and multiple dimension tables.
  • Developed the logical data model (LDM) & PDM and assisted the DBA to create the physical database by providing the DDL scripts including Indexes, Data Quality checks scripts and Base View scripts.
  • Applied Discover Financial Data Naming Standards, created new naming standards, checked models in and out of ER Studio Model Repository, and documented data model translation decisions.
  • Performed data analysis and reverse engineering of data.
  • Provided data modeling support to the Marketing operations design team.
  • Ensured delivery of LDM, PDM, Staging PDM and data lineage mapping documents.
  • Created new macros in visual basic and modified existing macros to perform data validations.
  • Developed Business and Subject Area Logical Data Models, Physical Data Models, Physical Staging Area Data Models and Data Flow Diagrams.
  • Managed the data model changes in all enterprise applications. Performed impact analysis to ensure all systems leads are informed and coordinate efforts to implement changes.
  • Developed, integrated, managed and promoted enterprise level logical and physical data models and data model standards that source data from multiple origins (internal data sources, external data sources, third party data.
  • Worked extensively with DBA and other DAs on the integration of the physical data model and OBIEE RPD and metadata with the logical Data Model.
  • Generated RPD metadata and worked with the data governance team on integration of data in Meta Center.
  • Developed detailed mapping documents that will provide the reporting and data warehouse team with source to Confidential data mapping includes logical names, physical names, data types, corporate meta-data definitions, and translation rules.
  • Conducted quality control/auditing to ensure accurate and appropriate use of data.
  • Performed data analysis on production data issues.
  • Assisted in developing and publishing data management standards, policies and procedures, including the management and maintenance of related tools and environments.
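The source-to-target mapping documents described above pair logical names, physical names, data types and translation rules for each column. A minimal Python sketch of applying such a mapping entry (the field names and the sample rule are illustrative assumptions, not the actual Discover standards):

```python
# One entry of a hypothetical source-to-target mapping document:
# logical name, physical source/target columns, type, and a translation rule.
mapping = [
    {
        "logical_name": "Customer Last Name",
        "source_column": "CUST_LST_NM",
        "target_column": "customer_last_name",
        "data_type": "VARCHAR(40)",
        "translation_rule": lambda v: v.strip().upper(),
    },
]

def apply_mapping(source_row, mapping):
    """Produce a target row by applying each column's translation rule."""
    return {
        m["target_column"]: m["translation_rule"](source_row[m["source_column"]])
        for m in mapping
    }

row = {"CUST_LST_NM": "  smith "}
print(apply_mapping(row, mapping))  # {'customer_last_name': 'SMITH'}
```

In practice the ETL team consumes the document directly, so keeping the rules declarative (as data, not code) is what makes the mapping auditable.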

Technologies: SQL Server 2005/2008, Oracle 10g, OLAP, OLTP, Teradata, DB2, VB Script, Multi-dimensional modeling, Data Warehousing, ER Win, ER/Studio, SQL, Sybase Power Designer, Hadoop, Map Reduce, Hive, HBase, My SQL workbench, Teradata, Microsoft Project/Office.

Confidential, Downers Grove, IL

Senior Data Modeler / Data Analyst

Responsibilities:

  • The position was a mix of data analyst and data modeler, responsible for assisting the data modelers specifically, and the project team in general, in gathering and documenting the information required to build the Business Information Model (BIM).
  • Analyzed business requirements and translated them into detailed conceptual data models, process models, logical models, physical models and entity relationship diagrams. Gathered enterprise level data requirements.
  • Communicated data needs internally and externally to 3rd parties. Analyzed and understood the data options and documented data dictionary.
  • Designed and created Data Marts as part of a data warehouse. Effectively used triggers and stored procedures necessary to meet specific application’s requirements.
  • Captured data extract requirements for multiple subject areas sourced from an external ERP system.
  • Maintained existing ETL procedures, fixed bugs and restored software to production environment.
  • Analyzed data which included investigation and analysis, documentation, filtering of bad data through generation of crystal reports.
  • Designed 3rd normal form Confidential data model and mapped to logical model. Worked with Data Warehouse Extract and load developers to design mappings for Data Capture, Staging, Cleansing, Loading, and Auditing.
  • Developed, enhanced and maintained Snow Flakes and Star Schemas within data warehouse and data mart conceptual & logical data models.
  • Performed Statistical Data Analysis as per the client request.
  • Participated in system integration, historical load, incremental load and parallel testing.
  • Aided and verified DDL implementation by DBAs, corresponded and communicated data and system specifications to DBA, development teams, managers and users, such as the Web-based 120-entity Confidential Card Services (GMCS) using Oracle 10g, SQL Server 2000, Sybase and MS Access.
  • Established auditing procedures to ensure continued data integrity.
  • Researched data sources for new business data and better sources of data feeds.
  • Developed and coordinated enhancements and maintenance of data warehouse and data mart logical models.
  • Analyzed content and quality of data model(s) and databases.
  • Work closely with the DBA team to translate the model(s) from logical design to physical implementation.
  • Validated the data in ER/Studio materialized views.
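The star and snowflake schemas maintained above share one shape: a fact table joined to dimension tables by surrogate keys, with roll-ups performed along dimension attributes. A hedged Python sketch of a roll-up along one dimension (table and column names are illustrative only):

```python
# Hypothetical star-schema fragment: a product dimension keyed by
# surrogate key, and a sales fact table referencing it.
dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gadget", "category": "Hardware"},
}

fact_sales = [
    {"product_key": 1, "qty": 3},
    {"product_key": 2, "qty": 5},
    {"product_key": 1, "qty": 2},
]

# "Roll up": aggregate facts to the category level via the dimension.
totals = {}
for row in fact_sales:
    category = dim_product[row["product_key"]]["category"]
    totals[category] = totals.get(category, 0) + row["qty"]

print(totals)  # {'Hardware': 10}
```

A snowflake schema differs only in that the dimension itself is normalized (e.g. `category` moved to its own table), which adds a join per level.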

Technologies: Data Warehouse, Data Modeling, Oracle warehouse builder, Data Architecture, Erwin, Windows XP Professional, MS SQL Server 2000, Oracle 10g, PERL, UNIX Shell Scripting, MS Access 2000, PL/SQL, T-SQL, TOAD, DB2, SQL Plus, Data Modal Mart, OLAP, ETL, Oracle, PL/SQL, ERWIN, MS Visio, Microsoft Tools, Java, J2EE

Confidential, NJ

Senior Business / Data Analyst

Responsibilities:

  • Determined source of data and designed and implemented extract, transformation and load (ETL) processes to populate data plans.
  • Assisted the Data Manager by leading and training junior team members.
  • Created and maintained documentation and descriptors for data elements / attributes, and other required forms of metadata.
  • Created and manipulated data in support of application test efforts.
  • Utilized queries and scripts to mine existing data.
  • Created and maintained logical relational data models for subsequent translation into physical database models.
  • Worked closely with DBA to translate logical data models into physical data models.
  • Assisted with ad-hoc query and report requests to various clients.
  • Provided consultation and training to business users and developers in the efficient use of data reporting and ad-hoc tools.
  • Set standards and ensured that all processes follow the data methodology.
  • Worked with Business Analysts and Testers to analyze data requirements to support all phases of testing.

Technologies: SQL, Crystal Reports, Oracle, Test Director, TOAD, UNIX Shell scripting, MS Access, MS Excel, Visual Source Safe 6.0

Confidential, Charlotte, NC

Data Analyst / Team Lead

Responsibilities:

  • Involved in reviewing business and functional requirements.
  • Prepared test plans from the requirement documents.
  • Involved in installing and maintaining MS SQL Server 2000/2005.
  • Created Data Model for the Marketing Information System Team using Erwin.
  • BOA was migrating legacy data to an SAP system and implementing a data warehouse for reporting purposes. The sales and marketing data needed to be brought back into the data warehouse through an ETL process.
  • Created SSIS packages for loading the data coming from various interfaces such as OMS, Adjustments and Objectives, and used multiple SSIS transformations to collect data from various sources.
  • Worked on SSIS packages and DTS import/export for transferring data from databases (Oracle and text-format data) to SQL Server.
  • Worked on the data warehouse design and analyzed various approaches for maintaining different dimensions and facts in the process of building a data warehousing application.
  • Using SSAS created OLAP cubes for data mining and created reports from OLAP cubes using SSRS.
  • Created components, tools, techniques, methods and procedures used in an On-Line Analytical Processing (OLAP) environment for accessing and translating data into understandable and usable business information using SSAS.
  • Data transformation and verification for various projects.
  • Managed and maintained Oracle and NoSQL databases in production domain.
  • Worked with other ETL developers in the Business Intelligence department contributing to large and small projects, customizing data warehouse solutions.
  • Designed and implemented a Master Record Management system for client, third-party, and internal systems’ source data, creating master data records, and publishing them to the CCR. The MRM system included staging file servers, a staging database, a file transfer process, ETL processes, and Master Record Management processes.
  • Conducted both 1:1 interviews with all organizational levels as well as participated in Joint Application/Joint Analysis Design sessions.
  • Responsible for the development of logical data models and physical data base designs applied across multiple computing environments.
  • Created black box test cases and highlighted requirement gaps through review meetings. Used Test Director for logging issues.
  • Developed white box test cases after reviewing technical design documents; verified unit test results for completeness and reported defects found early.
  • Involved in smoke test and in setting up the test environment.
  • Performed extensive integration and system testing, as well as end-to-end, regression and user acceptance testing for new releases.
  • Used XPEDITER for manual batch testing, FILE-AID for sequential and VSAM file handling, and SPUFI for querying the DB2 database.
  • Performed post-production support activity using CA7 and SAR for the newly developed production enhancements.
  • Identified JCL issues and data problems and reported them to the development team with possible solutions; tracked the tickets in the ClearQuest tool and followed them up to closure.
  • As Offshore Project Leader, was responsible for the delivery and quality of the deliverables and for managing the offshore team in India.
  • Created Debit Card BIN database with MS Access, VBA to facilitate extraction and storage of repeatable debit card data for test environment.
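The dimension-maintenance work above (the SSIS lookup plus conditional-split pattern) commonly implements Type 2 slowly changing dimensions: an attribute change expires the current row and inserts a new one, preserving history. A minimal Python sketch of that pattern, with hypothetical column names:

```python
# Hedged sketch of Type 2 SCD handling; the customer dimension's
# columns (cust_id, city, validity flags) are illustrative.
from datetime import date

dim = [{"cust_id": 7, "city": "Charlotte", "valid_from": date(2020, 1, 1),
        "valid_to": None, "current": True}]

def apply_scd2(dim, cust_id, new_city, as_of):
    # Look up the current row (the SSIS Lookup step).
    current = next(r for r in dim if r["cust_id"] == cust_id and r["current"])
    # Conditional split: only changed attributes trigger an expire + insert.
    if current["city"] != new_city:
        current["valid_to"] = as_of          # expire the old row
        current["current"] = False
        dim.append({"cust_id": cust_id, "city": new_city,
                    "valid_from": as_of, "valid_to": None, "current": True})

apply_scd2(dim, 7, "Raleigh", date(2021, 6, 1))
print(len(dim))  # 2: the expired history row plus the new current row
```

Type 1 handling would instead overwrite `city` in place, which is why choosing the SCD type per attribute was part of the warehouse design work.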

Environment: VS COBOL II, Microsoft Office, Teradata, HOGAN, DB2, JCL, UNIX,VB6, IMS/DB, CA7, PEM, SAR, MQ Series, SQL Server 7, Oracle 8i, VSAM, TSO, OS/2, ISPF, SDF, SCLM, FILEAID, XPEDITER, SQL server, C/Unix, DB2Connect, TAO, HTML, Test Director, Clear Quest, HTML, XML,MVS, VBA, MS ACCESS.

Confidential, Minneapolis, MN

Data Analyst

Responsibilities:

  • Reviewed business and functional requirement documents; identified project requirements and tested software functionality.
  • Assisted in the execution of the application transition plan.
  • Created test plans and managed test plans using test director.
  • Conducted internal test plan and test case walkthroughs.
  • Contributed to and assist in implementing improvements in test methods and strategies.
  • Worked closely with developers to create and execute test scripts from specifications and product requirements.
  • Supported the design, considering business requirements and system limitations. Some coding was required with COBOL and DB2.
  • Created test data on mainframe system. Executed the test cases both manually and using mercury tools.
  • Performed system and regression testing for new releases using WinRunner scripts.
  • Maintained traceability matrix for regression tests for software release builds.
  • Reviewed and contributed to software program documentation; performed SQA monitoring and auditing of the software development lifecycle, ensuring the selected lifecycle was followed and the defined processes were properly executed.
  • Worked with users and project teams to QA the documented use case scenarios, ensuring proper coverage of functionality and required test case scenarios.
  • Performed database tests using SQL. Reported bugs and attended Bug review meetings.
  • Developed REXX tools for quick data maintenance. Data was extracted from production to the test environment and loaded into test DB2 tables, then edited to suit test standards.

Environment: COBOL II, C, DB2, JCL, IMS/DB, CA7, ISIM, OMVIEW, UNIX, NED, SCLM, Easytrieve, SAS, CICS, JMR, Niku

Confidential, Minneapolis, MN

Production Support Analyst

Responsibilities:

  • Monitored the daily batch schedule in all the 3 companies (Dayton’s, Confidential and Mervyn’s) using CA7.
  • Solved production abends and scheduling problems with the batch jobs.
  • Consulted with PSRs on the resolution of the production problems and coordinated with Ospec for the execution of problem resolution.
  • Analyzed existing programs in COBOL, DB2, Telon, C, IMS, CICS, Cool: Gen with performance tuning perspective.
  • Prepared low level specifications and did enhancements to the programs. Decreased run times of the many long running critical programs.
  • Did Analysis, Coding and Unit testing of service requests by prioritizing daily inflow of tickets using Tivoli Problem Management bucket.
  • Did Impact analysis for the new business strategies on the areas interacting with other systems.
  • Created very detailed interface data, batch flow diagrams with HTML links.
  • Fixed the data, scheduling problems and file fixing of production batch jobs for the smooth functioning of the business.
  • Reduced production problems from over 90 per month to less than 6 per month. Metrics were regularly collected and presented to the project management.
  • Part of the team that incorporated Mervyn’s processing logic into Confidential’s information systems, enabling both companies to use a single information system.
  • Developed Rexx tools for quick data maintenance.
  • Fixed Code for issues and wrote new modules using COBOL/ DB2, Cool: gen, COBOL/IMS, C, Easytrieve, Telon skills.
  • Maintained and supported 60 non-critical systems spanning all three companies and introduced further enhancements to existing functionalities.
  • Received one of the two yearly service awards for stabilizing APX system.
  • Provided training on accounts receivable, audit system’s business process for all involved in the team.

Environment: IBM ES9000, WINDOWS NT, MVS, OS/390, COBOL, C, JCL, PLATINUM, NED, DATAVANTAGE, CICS, REXX, EZTRIEVE, IMS DB, DB2, FOCUS, VSAM, Softech, Abendaid, JMR, Insync, TPM, CA7, Info/x, ONCALL, MVS/COOL: GEN 5.1, TSO/ISPF
