Philosophy has been one of the pioneering disciplines with respect to the use of computers for searches of texts and of abstracts of publications. Current philosophical research involving the use of computers centres on three areas. First, investigation is being undertaken on concept searches. These are sophisticated searches, for example for selected patterns of occurrence of clusters of terms, all of which express the same general concept. Second, extensive research is conducted on the generation of hypertext methods for displaying and relating arguments (expressed in natural language). The arguments are presented in formats and structures which reflect the philosophical relationships between the arguments. Finally, the possibility of constructing philosophical expert systems is being explored. These are systems that will engage the user in argumentation at different levels of philosophical expertise.
The first use of computers in archaeology occurred in the 1960s and involved the application of statistical methods. It was associated with the emergence of new theoretical approaches to the study of prehistoric societies. From the early 1980s interest in the use of computer-based methods for academic archaeological research declined. This was the result of a growing disillusionment with the results achieved and of changing theoretical orientations within the discipline, which led to a rejection of what were perceived to be 'scientific' approaches. At the same time, the appearance of microcomputers made information technology (IT) techniques widely available for the first time to field archaeologists and others outside academic institutions. This led to a growth in the importance of mundane computer applications for recording excavations and post-excavation analysis. More recently there have been developments which have led to IT methods once again having a more central role in academic research. Geographic Information Systems provide new and more powerful ways of performing traditional archaeological tasks, as well as new means of analysing spatial data. The use of multimedia techniques over the Internet has the potential to solve current major problems of research access to raw archaeological data. The creation of virtual reality reconstructions can offer new insights into experiencing the past. The renewed interest in computer-based modelling has begun to provide a rigorous way of looking at how micro-scale social and ecological processes give rise to complex and counter-intuitive outcomes.
This paper focuses on one kind of use of computers in music: as tools for the development of music theory, and in particular on computer-based research in tonal theory. Early work in formalizing tonal theory in quasi-axiomatic terms, and writing programs to test these formulations, has revealed that this approach, while producing creditable results, is unlikely to arrive at a flawless theory. Later work has used computers in statistical studies based on empirical psychological data, and in modelling putative brain processes using a neural-network paradigm. In both cases the notion of tonal axioms is abandoned in favour of a notion of listeners' sense of a potentially shifting and ambiguous 'tonal centre'. The latter part of the paper argues that the axiomatic approach should not be ignored, but that it should be reoriented so as to focus on the comparison of alternative theories and on the 'constructive' application of theoretical rules. In both, computers have the potential to be useful tools.
Some of the reasons for the limited use of information technology by historians are discussed. For his research on medieval and sixteenth-century English political ideas, the author has developed both a prosopographical database of more than 2000 authors dealing with history and politics, giving information on their local origin, training, profession and textual production, and a corpus of machine-readable political texts of this period. Two examples of the application of factor analysis, one to the population of authors and the other to a sample of speeches in the English Parliament, are provided. The author's analysis began with punched cards and a mainframe computer, and the problems caused by the rapid changes in technology are described. The computer has been used of necessity to investigate aspects not amenable to traditional historical research. It has provided a key, but to use it the historian has had to become part archivist, part social scientist.
The historian is a storyteller who endeavours to relate an understanding of the past to the culture of the present. The rule-based pattern-seeking abilities of modern information technology tend to intervene in the tension between the historian's respect for the particularity of time, place and person and the need to generalize in the face of the rich profusion of information. The historiography of the 1832 Parliamentary Reform Act reveals a move from an understanding based upon the speeches and diaries of participants to one based upon voters and the act of voting. Poll books, as a source, have exactly those features of regularity and repetition so suited to Information Technology (IT). Several issues were raised by the survey, notably the importance of nominal record linkage and of the coding and categorization of information (and hence of social science perspectives), the relative failure of historians to use text analysis, the dangers of the self-sufficient universe of machine-readable data and the problems of presenting relatively complex results. It is important that all historians are able to evaluate the authority of computer-based results. IT has changed major features of historical understanding and, through Internet and CD-ROM technology, is changing the way in which many will study, research and enjoy seeking knowledge of the past.
Simulating the past: SOCSIM and CAMSIM and their applications in family and demographic history
Richard Smith, Cambridge Group for the History of Population and Social Structure, University of Cambridge
This paper describes and reviews research in which computer-aided microsimulation techniques have been directed towards the understanding of demographic patterns, co-residential arrangements and kin sets in past societies. After a review of the techniques, two principal ventures by microsimulators of historical phenomena are considered. One area of emphasis appeared first in the late 1970s and early 1980s and was largely directed towards the use of simulation techniques as a means of improving scholars' comprehension of evidence, usually existing in the form of cross-sectional 'snapshots'. The other focus of research by microsimulators of past situations, currently under development, has been on the provision of information on features of the social structure that were and are largely unrecorded, but can be derived, within tolerable margins of error, provided sufficient information on the determining processes is available. Usually, however, there are no sources against which the simulated characteristics and patterns relating to the past can be compared. Examples of work directed to the understanding of family and kinship patterns in Ancient Roman society and the recent Chinese past are discussed.
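The core logic of a demographic microsimulation can be sketched in a few lines. The sketch below is a minimal illustration, not taken from SOCSIM or CAMSIM: the population size, annual rates and closed-population assumption are arbitrary placeholders, and real historical microsimulations use age-specific rate schedules and track kinship links between individuals.

```python
import random

def simulate_population(n_individuals=1000, years=50, birth_rate=0.03,
                        death_rate=0.02, seed=42):
    """Monte Carlo microsimulation of a closed population.

    Each simulated year, every living individual independently risks
    death, and each survivor may produce one offspring. The rates are
    flat annual probabilities chosen purely for illustration.
    """
    rng = random.Random(seed)
    # Start with individuals of uniformly random ages 0-59.
    ages = [rng.randrange(0, 60) for _ in range(n_individuals)]
    for _ in range(years):
        # Mortality: each individual survives with probability 1 - death_rate.
        survivors = [a + 1 for a in ages if rng.random() > death_rate]
        # Fertility: each survivor bears a child with probability birth_rate.
        births = sum(1 for _ in survivors if rng.random() < birth_rate)
        ages = survivors + [0] * births
    return ages

final = simulate_population()
print(len(final))  # population size after 50 simulated years
```

Running many such replications with different seeds is what allows simulated kin-set and household distributions to be compared, within margins of error, against fragmentary historical evidence.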
Information technology should have much to offer linguistics, not only through the opportunities offered by large-scale data analysis and the stimulus to develop formal computational models, but through the chance to use language in systems for automatic natural language processing. The paper discusses these possibilities in detail, and then examines the actual work that has been done. It is evident that this has so far been primarily research within a new field, computational linguistics, which is largely motivated by the demands of and interest in practical processing systems, and that information technology has had rather little influence on linguistics at large. There are different reasons for this, not all of them good: information technology deserves more attention from linguists.
This paper focuses on the role of computer-based simulation of three-dimensional models of theatrical space, based on specific examples ranging from temporary Roman stages, through permanent Roman theatres, to 18th-century London theatres, culminating in the work in progress to restore the Hellerau Festspielhaus near Dresden as a venue for advanced experimental performance. The raw materials for the three-dimensional models range from two-dimensional paintings, site plans and fragmentary elements to the surviving buildings themselves. Particular attention is given to Rome's first permanent theatre, the largest the Romans ever built and now largely hidden by later building in the centre of Rome, making excavation impossible and three-dimensional simulation the only feasible approach.
A computer-generated 3D depiction of the architectural elements on a painting from the Villa of Oplontis near Pompeii
A temporary replica stage built at the J. Paul Getty Museum
Computer 3D model of a Rhythmic Space
After a review of the resistance of art historians to computerisation, the necessity of such a move is recognized. Art history is unique in terms of imagery, in that the image is its text. This poses two problems - the importance of the quality of the image and the method of codification, since there appears to be no discrete equivalent to the letter, number or musical note. However, the process of digitization does produce such an equivalent, the individual picture element (pixel). This process provides stability, and the product is readily transferable and manipulable, permitting the analysis of images. One problem for the scholar is access to digital sources, because of copyright and the relative scarcity of good-quality resources. Attempts are being made to develop expert systems, but one system described, though successful, is very specific in its application. A possible complementary 'bottom up' approach is represented by the author's Morelli system, which provides a simple unique identifier for each image, thus facilitating searches of catalogues of digital imagery. The paper also explores the role of multi-media and the Internet, the latter particularly for access to digital sources and for communication between scholars. Digital sources have the potential for formal and structural analysis and hence the consideration of form, a return to the original emphasis in art history, replacing the modern tendency to see images purely iconically.
Detail of a Vasari high resolution digital reproduction of the Arnolfini Portrait
The paper summarizes and explains the impact that information technology (IT) has had so far on the conduct of two traditional branches of legal scholarship: legal exposition and legal studies. Although it is suggested that the impact to date has been slight, it is claimed that affordable emerging technologies and shifting attitudes will result, in years to come, in far greater usage in both these branches of legal scholarship. It is also argued that IT will itself bring about a fundamental shift in paradigm in the provision of legal services, a shift which will radically transform the emphasis and scope of legal exposition and will raise fundamental issues for legal studies about the nature of law. This shift in paradigm is clarified in two ways: first, by reference to the changes in the information substructure in society, as progress is made from a print-based industrial society to an IT-based information society; and, second, in the context of movement along a legal information continuum on which information technologies are positioned between written legal sources and human legal experts. The paper concludes by demonstrating the limitations inherent in any legal information system.
To assess the impact of information technology in the social sciences, it is argued that generic soft technologies and generic theories should be sought and identified. This argument is illustrated by four examples: accounts in demography and economics, health service delivery, urban development and site identification in archaeology. It is argued that the potential of these developments has not been fully realized because social science research has been under-funded rather than properly treated as 'big' science.
Computer simulation is not just a new method to add to the social researcher's armoury, but a new way of thinking about society, and especially social processes. Conventional methods have some difficulty in investigating social dynamics and in testing theories of social processes. Recent advances in computer technology have for the first time made it possible to carry out complex simulations and this has given rise to a burgeoning interest in the opportunities for theoretical and methodological developments in simulation within the social sciences. This paper reviews a variety of examples of present day simulation studies taken from anthropology, social psychology and economics as well as sociology. It identifies some theoretical ideas that have been inspired by simulations and considers some methodological issues applicable not only to simulation but to the social sciences in general.
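A classic illustration of how simple local rules can generate complex, counter-intuitive social outcomes is Schelling's segregation model. The sketch below is a hypothetical one-dimensional simplification with arbitrary parameters, not an implementation from any of the studies reviewed here: agents of two types sit on a ring, and unhappy agents (those whose neighbours are too unlike them) swap places at random.

```python
import random

def run_schelling(size=100, threshold=0.5, steps=200, seed=1):
    """One-dimensional Schelling-style segregation model.

    An agent is unhappy if fewer than `threshold` of its two
    neighbours share its type; each step, two randomly chosen unhappy
    agents swap positions. Returns the final segregation index: the
    fraction of adjacent pairs with the same type.
    """
    rng = random.Random(seed)
    grid = [rng.choice([0, 1]) for _ in range(size)]

    def unhappy(i):
        same = (grid[i - 1] == grid[i]) + (grid[(i + 1) % size] == grid[i])
        return same / 2 < threshold

    for _ in range(steps):
        movers = [i for i in range(size) if unhappy(i)]
        if len(movers) < 2:
            break  # (almost) everyone is content; dynamics have settled
        a, b = rng.sample(movers, 2)
        grid[a], grid[b] = grid[b], grid[a]

    same_pairs = sum(grid[i] == grid[(i + 1) % size] for i in range(size))
    return same_pairs / size

print(run_schelling())
```

The methodological point is that the macro-level pattern (the segregation index) is not stated anywhere in the rules; it emerges from repeated local interactions, which is precisely the kind of social dynamics that conventional statistical methods struggle to investigate.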
Cognitive psychology is a large sub-domain of psychology concerned with identifying and understanding the processes which underlie human cognition. It developed following a paradigm shift in the 1950s when psychologists abandoned 'Behaviourism' with its stimulus-response models of behaviour and became interested in the internal processes and representations involved in human information processing. The introduction of computers provided a powerful analogy for these mental operations. Thus the introduction of information technology (IT) had a profound impact on the way psychologists modelled behaviour. In the 1980s there were challenges to the computational metaphor. Psychologists who adopt a 'Connectionist approach' claimed that cognition should be explained in terms of interactions between connected neural-like elements. Models should correspond to the neural processes of the human brain. Connectionists use powerful computer simulation techniques to model human information processes and regard this as a second paradigm shift based upon the appropriate use of IT. This paper will concentrate on a third aspect of the relationship between IT and psychology: the desire to understand the way in which human users interact with the technology, its impact upon human activities such as communication and decision making, and how the technology should be designed to facilitate these processes. The goals of this 'Cognitive Engineering' approach will be illustrated in terms of developing design guidelines for technologies and understanding human behaviour in new technologically-mediated milieus.
After some general reflections on the differences that information technology (IT) has made to the practice of applied research in economics, the main focus of this paper is on the use of microdata in policy simulation models. These models are in principle accessible to a wide range of users and it is important that their advantages and limitations are fully understood. Some of the issues are illustrated by an application of the Microsimulation Unit's model, POLIMOD. The widespread use of such models is potentially threatened unless a new framework for the regulation of the use of microdata can be developed. Advances in IT have meant that the distinctions between data and other model components are increasingly blurred. A new approach to data access needs to take account of the increased integration of the processes involved in data analysis and policy evaluation.
The paper discusses the evolution of a software system, and its influence on econometric practice. The main trends are from essentially 'techniques' to primarily graphical tools; a formalization of the underlying econometric modelling methodology; and a greatly increased emphasis on the user interface. Software has not only enabled vast calculations to be made that previous researchers deemed infeasible and allowed graphical presentation to render the resulting mass of output comprehensible, it has also altered the way in which econometric modelling is practised, substantially improving the quality of empirical research. The present software exploits the Windows operating system to show data, results, evaluations, and graphics simultaneously on screen, thereby further facilitating econometric analyses.
The paper describes some of the ways by which High Performance Computing (HPC) hardware can be used in human geographical research. There is a discussion of the potential of HPC, and also of the impediments facing its wider use in geography and more generally in the social sciences. It is argued that supercomputing is now an extremely useful paradigm for geographical research, and a new term, GeoComputation, has recently been invented to represent the adoption of a computationally intensive approach. The paper illustrates some of the possibilities via a number of case studies based on use of the Cray T3D parallel supercomputer at Edinburgh.
Social anthropologists are specialists in local culture, a task which has traditionally involved the study of communities in the field, generally in locations that are remote from the bases where the results will be analysed and described. The introduction of electronic techniques for collecting data has had several different effects on research techniques, although (as in other disciplines) developing these has been the work of a self-selected minority of dedicated individuals. Most changes have affected the anthropologist's 'toolbox' for collecting information in the field, in terms of what is recorded and in what form; others have worked to reduce the isolation of the individual researcher in the field. The impact of Information Technology (IT) has been greater in some fields, notably in visual anthropology, through the making of films, the use of multi-media techniques in both teaching and research, and the application of computing for the analysis of the material collected. At the same time, the impact of IT on global communications has affected the meaning of local culture in the communities being studied and in the approaches used by anthropologists to seek the cooperation of communities in securing the information they require; in particular, the development of IT skills within such communities has led some anthropologists to adopt a supporting role. All this has taken place against a background of debate among social anthropologists on their commitments to governments and to the communities being studied, which provides a context against which the impact of IT can be assessed.
Networked communication has become one of the dominant forces for change. This paper describes some of the impacts of networked communication on scholarship and draws attention to dangers that its rapid uptake may pose to scholarship. The long-term influence of networking on scholarship will depend in part on the increase in the quantity, diversity and quality of information resources available in digital form. The digital medium provides opportunities that print could never support, such as facilities to use a variety of data types and structures within a single scholarly work; new access and distribution models; and the capability to reanalyse dynamically data sets used in support of scholarly arguments. It poses institutional, sociological and technical obstacles. For example, the proliferation of resources in digital form requires that they be preserved, and doing so requires co-ordinated and comprehensive planning.