Perspectives of Information Systems / Edition 1 available in Hardcover
- ISBN-10: 0387987126
- ISBN-13: 9780387987125
- Pub. Date: 06/24/1999
- Publisher: Springer New York
Product Details
| ISBN-13: | 9780387987125 |
| --- | --- |
| Publisher: | Springer New York |
| Publication date: | 06/24/1999 |
| Edition description: | 1999 |
| Pages: | 271 |
| Product dimensions: | 6.10(w) x 9.25(h) x 0.03(d) |
Table of Contents
- I Methodologies and Metamodels for Information System Development and Their Evaluation
  - 1. Framework for Information Activities
    - 1.1. Motivation
    - 1.2. Foundations of the Framework
      - 1.2.1. Information System and Its Environment
      - 1.2.2. Principles of Level Construction
    - 1.3. IST, the Fundamental Conceptual Component of Information Systems
    - 1.4. Levels of Information Activity
      - 1.4.1. Hierarchical Structure
      - 1.4.2. IS Use Level
      - 1.4.3. ISD Level
      - 1.4.4. ISD Model Construction Level
      - 1.4.5. Theory Development Level
      - 1.4.6. Interpretations of the Framework
    - 1.5. Evaluation
      - 1.5.1. Framework for Research
      - 1.5.2. Frame of Reference for Tools
    - 1.6. Conclusions
  - 2. Analysis of Three IS and ISD Reference Frameworks
    - 2.1. Related Works
    - 2.2. Definitions and Concepts
    - 2.3. A Short Description of Three Reference Frameworks
      - 2.3.1. Framework for Understanding
      - 2.3.2. HECTOR Framework of Reference
      - 2.3.3. FRISCO Framework
    - 2.4. Scheme for Evaluating Metamethodologies
      - 2.4.1. Internal Validity of Frameworks
      - 2.4.2. External Validity of Frameworks
      - 2.4.3. Coverage of Frameworks
    - 2.5. Comparison Results
      - 2.5.1. Internal Validity of Frameworks
      - 2.5.2. External Validity of Frameworks
    - 2.6. Conclusions
- II Contingency Factors and Uncertainty in Decision Making in Information System Development
  - 3. Favorable Atmosphere for Effective Information Technology Decisions
    - 3.1. Basic Assumptions
    - 3.2. MISD Contingency Factors
    - 3.3. Main Information Technology Decisions
    - 3.4. Priority Setting and Other Strategic Decisions
    - 3.5. Feasibility Study, Contingency Analysis, and Methodology Selection
    - 3.6. MIS Implementation Decisions
    - 3.7. Introduction of the New MIS and Maintenance Decisions
    - 3.8. Conclusions
- III Development of Holistic ISD Methodologies and Selection of ISD Methods and Tools
  - 4. Overview of the OSSAD Methodology
    - 4.1. Principles, Functions, and Approach
    - 4.2. Main Principles
    - 4.3. Approach
      - 4.3.1. Set Contract
      - 4.3.2. Analyze Situation
      - 4.3.3. Design System
      - 4.3.4. Implement Changes
      - 4.3.5. Monitor System Performance
    - 4.4. Management Issues
      - 4.4.1. Organization
      - 4.4.2. Procedures for Getting Under Way
    - 4.5. Modeling
    - 4.6. Language
      - 4.6.1. Abstract Model
      - 4.6.2. Descriptive Model
      - 4.6.3. Specification Models
    - 4.7. Conclusions
  - 5. Refinement of the OSSAD Methodology by Multiclient Field Testing
    - 5.1. Overview
    - 5.2. Framework for Research Work
      - 5.2.1. Research Objective
      - 5.2.2. Research Setting
    - 5.3. Research Methods
      - 5.3.1. Study Methods
      - 5.3.2. Experimental Methods
      - 5.3.3. Action Research
      - 5.3.4. Methodological Pluralism
    - 5.4. Field Test Practice in the Development of OSSAD Methodology
      - 5.4.1. The Application of the 7S-Frame to the OSSAD Field Test Practice
      - 5.4.2. The Contextual Framework of the Analysis
      - 5.4.3. Sequence of Methodological Steps
      - 5.4.4. Use and Relevance of Instruments and Procedures, Models and Concepts
      - 5.4.5. Strategic Aspects and Findings on the OSSAD Methodology
    - 5.5. Some Results of Other OSSAD Field Testings
    - 5.6. Conclusions
  - 6. Technical Specification of an Information System
    - 6.1. Technical Information System Specification
    - 6.2. Information System Structure and User Interface Specification
    - 6.3. Software Specification
    - 6.4. Specification of Files, Databases, and Knowledge Bases
    - 6.5. Specification of Data Media, Hardware, and Other Facilities
    - 6.6. Specification of Systems Interconnections
    - 6.7. Specification of Information System Quality and Control Features
    - 6.8. Framework for Technical Specification Process
    - 6.9. Conclusions
  - 7. Decision Criteria for Information System Development Tool Selection
    - 7.1. Types of Tools for Supporting ISD Processes
    - 7.2. Reasons for Acquiring a Tool
    - 7.3. Issues in Selection Processes
    - 7.4. Rating Selection Criteria
    - 7.5. Computerized Support of the Selection Process
    - 7.6. Conclusions
- IV Toward Intelligent Executive Information Systems
  - 8. Strategic Decision Making
    - 8.1. Motivation for Classifying Decision Problems
    - 8.2. Classification of Strategic Decision-Making Problems
    - 8.3. Analysis of Information Requirements
    - 8.4. Decisions Supported by Knowledge-Based Technology
    - 8.5. Conclusions
  - 9. Evaluation of Executive Information Systems
    - 9.1. Acquiring an EIS Product to Match the Managerial Requirements
    - 9.2. Development of Managerial Support Systems
      - 9.2.1. Management Information Systems
      - 9.2.2. Decision Support Systems
      - 9.2.3. Executive Information Systems
      - 9.2.4. Generalization Trends in Information Systems for Managerial Use
    - 9.3. Framework for Evaluating EIS Products
    - 9.4. Functional Capabilities of EIS Products
    - 9.5. Qualitative Properties of EIS Products
    - 9.6. Technical Properties of EIS
    - 9.7. Cost Issues
    - 9.8. User’s Experiences
      - 9.8.1. Utilization of the Functional Properties of EIS
      - 9.8.2. Utilization of the Qualitative Properties of EIS
      - 9.8.3. Utilization of the Technical Properties of EIS
    - 9.9. Issues in Construction, Introduction, and Use of EIS
    - 9.10. Conclusions
  - 10. Application of Knowledge-Based Technology in Executive Information Systems
    - 10.1. The Diversity of Decision Support Systems
    - 10.2. DSS, EIS, and ESS
      - 10.2.1. Decision Support Systems
      - 10.2.2. Executive Information Systems
      - 10.2.3. Executive Support Systems
    - 10.3. IESS as an Integration of ES/KBS Technology to EIS and DSS
      - 10.3.1. Expert Systems and Knowledge-Based Systems
      - 10.3.2. Intelligent Decision Support Systems
      - 10.3.3. Intelligent Executive Support Systems
    - 10.4. Structure of Intelligent Executive Support Systems
      - 10.4.1. The Interface
      - 10.4.2. Office Support Subsystem
      - 10.4.3. Database Subsystem
      - 10.4.4. Model Base Subsystem
      - 10.4.5. Knowledge Base Subsystem
      - 10.4.6. Work Space
    - 10.5. Conclusions
- V Mobile Information Systems
  - 11. Executives’ Views of Mobile Information Services
    - 11.1. Concepts, Research Objectives, and Research Methods
      - 11.1.1. Basic Concepts
      - 11.1.2. Research Objectives
      - 11.1.3. Research Methods
    - 11.2. Executives’ Work
      - 11.2.1. Main Characteristics of Executives’ Work
      - 11.2.2. Trends
      - 11.2.3. Our Case Studies of Executives
      - 11.2.4. Time Distribution of Executives’ Work
    - 11.3. Information Technology Support for Executives’ Tasks
      - 11.3.1. Personal Work Support
      - 11.3.2. Mail and Communication Services
      - 11.3.3. Information Services
      - 11.3.4. Office Support Services
      - 11.3.5. Analytical Support Services
      - 11.3.6. Other Services
      - 11.3.7. IT Use Experiences and the Executives’ Tasks
    - 11.4. Trends in Information Technology Support for Executives
      - 11.4.1. Value-Added Services
      - 11.4.2. Executives’ Expectations of Mobile IT Services
    - 11.5. Conclusions
- VI Exception Handling in Information Systems
  - 12. Basic Concepts of Exception Handling in Office Information Systems
    - 12.1. Event Handling
    - 12.2. Dynamic Nature of Office Information Systems
      - 12.2.1. Office
      - 12.2.2. Rules
      - 12.2.3. Exceptions
    - 12.3. Characteristics of Exceptionality
      - 12.3.1. Severity Classes of Exceptions
      - 12.3.2. Frequencies of Exceptions
      - 12.3.3. Organizational Influence of Exceptions
      - 12.3.4. Reasons for Exceptions
    - 12.4. Exception Handling Principles
      - 12.4.1. Event Handling
      - 12.4.2. Action Analysis
      - 12.4.3. Handling of Established Exceptions
      - 12.4.4. Handling of Otherwise Exceptions
      - 12.4.5. Handling True Exceptions
    - 12.5. Exception Handling in Organizations
      - 12.5.1. Levels of Information Systems
      - 12.5.2. Exception Handling Practices
      - 12.5.3. Consequences of Exception Occurrence
    - 12.6. Conclusions
- VII Quality Assurance and Performance Evaluation of Information Systems
  - 13. Concepts and Practices in Performance Evaluation of Office Information Systems
    - 13.1. System Performance and Changing User Preferences
    - 13.2. Concepts and Criteria
      - 13.2.1. Main Concepts
      - 13.2.2. Evaluation Criteria in OSSAD Models
      - 13.2.3. Evaluation of the OIS Development Process and Its Results
      - 13.2.4. Organizational, Social, Economic, and Technical Criteria for the Evaluation
    - 13.3. Framework for Dynamic OIS Evaluation
    - 13.4. Evaluation Practices and User Participation
    - 13.5. Conclusions
  - 14. Analysis of the Dynamic Nature of Information Systems Performance Evaluation
    - 14.1. Theoretical Background and Research Issues
      - 14.1.1. Theoretical Foundations and Research Methodology
      - 14.1.2. Research Issues
    - 14.2. Evaluation from the Systems Developers’ Viewpoints
    - 14.3. Changing Interest in Systems Evaluation Through the ISLC
    - 14.4. Changing Evaluation Criteria Throughout the ISLC
    - 14.5. Conclusions
  - 15. Performance Evaluation of an Information System: An Experiment
    - 15.1. Background for Our Case Studies
    - 15.2. Performance Evaluation Process
    - 15.3. Performance Evaluation Criteria and Measurement
    - 15.4. Performance Evaluation over the ISLC
    - 15.5. Some Evaluation Experiments
    - 15.6. Conclusions