The Laboratory Computer (eBook)
354 pages
Elsevier Science (publisher)
978-0-08-052155-8 (ISBN)
The Laboratory Computer: A Practical Guide for Physiologists and Neuroscientists introduces the reader to both the basic principles and the actual practice of recording physiological signals using the computer. It describes the basic operation of the computer, the types of transducers used to measure physical quantities such as temperature and pressure, how these signals are amplified and converted into digital form, and the mathematical analysis techniques that can then be applied. It is aimed at the physiologist or neuroscientist using modern computer data acquisition systems in the laboratory, providing both an understanding of how such systems work and a guide to their purchase and implementation.

- The key facts and concepts that are vital for the effective use of computer data acquisition systems
- A unique overview of the commonly available laboratory hardware and software, including both commercial and free software
- A practical guide to designing one's own or choosing commercial data acquisition hardware and software
Front Cover 1
The Laboratory Computer 4
Copyright Page 5
Series Preface 6
Preface 8
Contents 10
Chapter One. Introduction 14
1.1 The rise of the laboratory computer 15
1.2 The data acquisition system 17
1.3 Analysing digitised signals 20
1.4 Analysis of electrophysiological signals 21
1.5 Image analysis 23
1.6 Software development 23
1.7 Summary 24
Chapter Two. The Personal Computer 25
2.1 Computer families 25
2.2 Main components of a computer system 27
2.3 The central processing unit 30
2.4 Random access memory 32
2.5 Cache memory 33
2.6 Motherboards 34
2.7 Magnetic disc storage 35
2.8 Removable disc storage 38
2.9 Interface buses and expansion slots 40
2.10 Input devices 42
2.11 Video displays 43
2.12 Peripheral device interfaces 46
2.13 Printers and other output devices 48
2.14 Operating systems 50
2.15 Computer networks 56
2.16 Further reading 57
Chapter Three. Digital Data Acquisition 58
3.1 Digitising analogue signals 59
3.2 The Nyquist criterion 60
3.3 The A/D converter 61
3.4 The laboratory interface unit 65
3.5 Laboratory interface–host computer connections 67
3.6 Laboratory interfaces and suppliers 72
3.7 Recording modes 80
3.8 Data acquisition software 83
3.9 Choosing a data acquisition system 85
3.10 Further reading 86
Chapter Four. Signal Conditioning 87
4.1 Amplifiers 88
4.2 Analogue filtering 96
4.3 Event detectors 101
4.4 Signal conditioners 102
4.5 Interference and its elimination 105
4.6 Stimulators 110
4.7 Further reading 112
Chapter Five. Transducers and Sensors 114
5.1 Basic transducer properties 115
5.2 Temperature transducers 117
5.3 Light detectors 120
5.4 Force transducers 128
5.5 Pressure transducers 134
5.6 Chemical Sensors 138
5.7 Further reading 148
Chapter Six. Signal Analysis and Measurement 149
6.1 Signal measurement 149
6.2 Basic waveform characteristics 153
6.3 Signal averaging 156
6.4 Digital filters 158
6.5 Frequency domain analysis 160
6.6 Curve fitting 168
6.7 Analysis of random distributions 181
6.8 Further reading 184
Chapter Seven. Recording and Analysis of Intracellular Electrophysiological Signals 185
7.1 Origin of bioelectrical signals 186
7.2 Cell equivalent circuits 187
7.3 Intracellular recording techniques 188
7.4 The intracellular data acquisition system 192
7.5 Experimental paradigms 197
7.6 Analysis of voltage-activated currents 199
7.7 Analysis of synaptic signals 207
7.8 Single-channel currents 216
7.9 Noise analysis 227
7.10 Cell capacitance 234
7.11 Further reading 238
Chapter Eight. Recording and Analysis of Extracellular Electrophysiological Signals 239
8.1 Extracellular potentials 239
8.2 Recording electrodes 241
8.3 Electromyography 242
8.4 Electrocardiography 244
8.5 Electroencephalography 251
8.6 Recording activity of single neurons 254
8.7 Analysis of neural spike trains 262
8.8 Neural signal acquisition systems 268
8.9 Further reading 273
Chapter Nine. Image Analysis 274
9.1 Digitisation of images 275
9.2 Image acquisition devices 280
9.3 Charge-coupled devices 280
9.4 CCD readout architectures 282
9.5 CCD performance 283
9.6 Electronic cameras 284
9.7 Analogue video signal formats 285
9.8 Analogue video cameras 287
9.9 Camera performance specifications 287
9.10 Digitising analogue video signals 289
9.11 Digital cameras 291
9.12 Digital frame grabbers 294
9.13 Scanners 294
9.14 Confocal microscopy 297
9.15 Image analysis 299
9.16 Image calibration 301
9.17 Image arithmetic 303
9.18 Spatial filtering 304
9.19 Image analysis software 304
9.20 Analysis of moving images 307
9.21 Three-dimensional imaging 309
9.22 Further reading 310
Chapter Ten. Software Development 312
10.1 Computer programs 313
10.2 Assembler code 313
10.3 Programming language features 314
10.4 User interface design 319
10.5 Software development tools 321
10.6 Visual Basic 321
10.7 Borland Delphi 326
10.8 Visual C++ 328
10.9 Multiplatform software development 330
10.10 Matlab 330
10.11 LabVIEW 334
10.12 Choosing a development system 336
10.13 Further reading 338
References 339
Suppliers 350
Index 354
Introduction
The computer now plays a central role in the laboratory, as a means of acquiring experimental data, analysing that data, and controlling the progress of experiments. An understanding of the computer, and of the principles by which experimental data are digitised, has become an essential part of the (ever-lengthening) skill set of the researcher. This book provides an introduction to the principles and practical application of computer-based data acquisition systems in the physiological sciences. The aim here is to provide a coherent view of the methodology, drawing together material from disparate sources, usually found in highly compressed form in the methods sections of scientific papers, in short technical articles, or in manufacturers’ product notes.
An emphasis is placed on both principles and practice. An understanding of the principles by which the physiological systems one is studying are measured is necessary to avoid error through the introduction of artefacts into the recorded data. A similar appreciation of the theoretical basis of any analysis methods employed is also required. Throughout the text, reference is therefore made to the key papers that underpin the development of the measurement and analysis methodologies being discussed. At the same time, it is important to have concrete examples and to know, in purely practical terms, where such data acquisition hardware and software can be obtained, and what is involved in using it in the laboratory. The main commercially available hardware and software packages used in this field are therefore discussed, along with their capabilities and limitations. In all cases, the supplier’s postal and website addresses are given. A significant amount of public domain, or ‘freeware’, software is also available, and the reader’s attention is drawn to the role that this kind of software plays in research.
Physiology – the study of bodily function, and particularly of how the internal state is regulated – can, more than any other of the life sciences, be considered a study of signals. A physiological signal is a time-varying change in some property of a physiological system, at the cellular, tissue or whole-animal level. Many such signals are electrical in nature, cell membrane potential and current for instance, or chemical, such as intracellular ion concentrations (H+, Ca++). But almost any of the fundamental physical variables – temperature, force, pressure, light intensity – finds some physiological role. Records of such signals provide the raw material from which an understanding of body function is constructed, with advances in physiology often closely associated with improved measurement techniques. Physiologists, and particularly electrophysiologists, have always been ready to exploit new measurement and recording technology, and computer-based data acquisition is no exception.
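The recording process sketched above – a time-varying analogue signal sampled at regular intervals and converted to integer codes by an A/D converter – can be illustrated in a few lines of Python. The book itself presents no code, so the function name, parameters and the choice of a 12-bit converter with a ±1 V input range are illustrative assumptions only:

```python
import math

def digitise(signal, duration_s, sample_rate_hz, v_min, v_max, bits=12):
    """Sample a continuous signal function at regular intervals and
    quantise each sample as an ideal A/D converter would."""
    levels = 2 ** bits                        # e.g. 4096 levels for 12 bits
    step = (v_max - v_min) / levels           # one quantisation step (volts)
    n_samples = int(duration_s * sample_rate_hz)
    codes = []
    for i in range(n_samples):
        t = i / sample_rate_hz                # sample instant (seconds)
        v = min(max(signal(t), v_min), v_max - step)  # clip to input range
        codes.append(int((v - v_min) / step))         # integer code 0..levels-1
    return codes

# A 10 Hz, 50 mV sine wave standing in for a recorded physiological signal,
# sampled at 1 kHz for 100 ms over a +/-1 V converter input range.
wave = lambda t: 0.05 * math.sin(2 * math.pi * 10 * t)
codes = digitise(wave, duration_s=0.1, sample_rate_hz=1000, v_min=-1.0, v_max=1.0)
```

With these settings the converter produces 100 samples, each an integer between 0 and 4095; a 0 V input maps to the mid-range code 2048. The choice of sampling rate relative to the signal's frequency content is governed by the Nyquist criterion discussed in Chapter Three.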
1.1 THE RISE OF THE LABORATORY COMPUTER
Computers first started to be used in the laboratory about 45 years ago, about 10 years after the first digital computer, the ENIAC (Electronic Numerical Integrator And Calculator), had gone into operation at the University of Pennsylvania. Initially, these machines were very large, room-size devices, seen exclusively as calculating machines. However, by the mid-1950s laboratory applications were becoming conceivable. Interestingly enough, the earliest of these applications was in the physiological (or at least psychophysiological) field. The Whirlwind system developed by Kenneth Olsen and others at the Massachusetts Institute of Technology, with primitive cathode ray tube (CRT) display systems, was used for studies into the visual perception of patterns, associated with the air defence project that lay behind the funding of the computer (Green et al., 1959). The Whirlwind was of course still a huge device, powered by vacuum tubes, and reputed to dim the lights of Cambridge, Massachusetts when operated, but the basic principles of modern laboratory computing could be discerned. It was a system controlled by the experimenter, acquiring data in real time from an experimental subject and displaying the results in a dynamic way.
Olsen went on to found Digital Equipment Corporation (DEC), which pioneered the development of the minicomputer. Taking advantage of the developments in integrated circuit technology in the 1960s, minicomputers were much smaller and cheaper (although slower) than the mainframe computers of the time. While a mainframe, designed for maximum performance and storage capacity, occupied a large room and required specialised air conditioning and other support, a minicomputer took up little more space than a filing cabinet and could operate in the normal laboratory environment. Clark & Molnar (1964) describe the LINC (Laboratory INstrument Computer), a typical paper-tape-driven system of that time (magnetic disc drives were still the province of the mainframe). However, it could digitise experimental signals, generate stimuli, and display results on a CRT. The DEC PDP-8 (Programmable Data Processor) minicomputer was the first to go into widespread commercial production, and a variant of it, the LINC-8, was designed specifically for laboratory use. The PDP-8 became a mainstay of laboratory computing throughout the 1960s, being replaced by the even more successful PDP-11 series in the 1970s.
Although the minicomputer made the use of a dedicated computer within the experimental laboratory feasible, it was still costly compared to conventional laboratory recording devices such as paper chart recorders. Consequently, applications were restricted to areas where a strong justification for their use could be made. One area where a case could be made was in the clinical field, and systems for the computer-based analysis of electrocardiograms and electroencephalograms began to appear (e.g. Stark et al., 1964). Electrophysiological research was another area where the rapid acquisition and analysis of signals could be seen to be beneficial. H.K. Hartline was one of the earliest to apply the computer to physiological experimentation, using it to record the frequency of nerve firing in the Limulus (horseshoe crab) eye in response to a variety of computer-generated light stimuli (see Schonfeld, 1964, for a review).
By the early 1980s most well-equipped electrophysiological laboratories could boast at least one minicomputer. Applications had arisen, such as the spectral analysis of ionic current fluctuations or the analysis of single ion channel currents, that could only be successfully handled using computer methods. Specialised software for these applications was being developed by a number of groups (e.g. D’Agrosa & Marlinghaus, 1975; Black et al., 1976; Colquhoun & Sigworth, 1995; Dempster, 1985; Re & Di Sarra, 1988). The utility of this kind of software was becoming widely recognised, but it was also becoming obvious that its production was difficult and time-consuming. Because of this, software was often exchanged informally between laboratories which had existing links with the software developer or had been attracted by demonstrations at scientific meetings. Nevertheless, the cost of minicomputer technology, right up to its obsolescence in the late 1980s, prevented it from replacing the bulk of conventional laboratory recording devices.
Real change started to occur with the development of the microprocessor – a complete computer central processing unit on a single integrated circuit chip – by Intel Corp. in 1974. As with the minicomputer in its own day, the first microprocessor-based computers were substantially slower than the contemporary minicomputers, but their order-of-magnitude lower cost opened up a host of new opportunities for their use. New companies appeared to exploit the new technology, and computers such as the Apple II and the Commodore PET began to appear in the laboratory (examples of their use can be found in Kerkut, 1985; or Mize, 1985). Not only that; computers had become affordable to individuals for the first time, and they began to appear in the home and small office. The era of the personal computer had begun.
As integrated circuit technology improved, it became possible to cram more and more transistors on to each silicon chip. Over the past 25 years this has led to a constant improvement in computing power and reduction in cost. Initially, each new personal computer was based on a different design, and software written for one computer could not be expected to run on another. As the industry matured, standardisation began to be introduced, first with the CP/M operating system and then with the development of the IBM (International Business Machines) Personal Computer in 1981. As IBM was the world’s largest computer manufacturer at the time, the IBM PC became a de facto standard, with many other manufacturers copying its design and producing IBM PC-compatible computers or ‘clones’. Equally important was the appearance of the Apple Macintosh in 1984, the first widely available computer with a graphical user interface (GUI), which used the mouse as a pointing device. Until the introduction of the Macintosh, using a computer required the user to learn its operating system’s command language, a significant disincentive to many. The Macintosh, on the other hand, could be operated by selecting options from a...
Publication date (per publisher) | 2 July 2001
Language | English
Subject areas | Mathematics / Computer Science ► Computer Science ► Theory / Studies
 | Medicine / Pharmacy ► Medical Specialties ► Pharmacology / Pharmacotherapy
 | Degree Studies ► 1st Stage (Preclinical) ► Physiology
 | Degree Studies ► 2nd Stage (Clinical) ► Human Genetics
 | Natural Sciences ► Biology ► Biochemistry
 | Natural Sciences ► Biology ► Genetics / Molecular Biology
 | Natural Sciences ► Biology ► Zoology
 | Technology
ISBN-10 | 0-08-052155-X / 008052155X
ISBN-13 | 978-0-08-052155-8 / 9780080521558
Copy protection: Adobe DRM
Adobe DRM is a copy-protection scheme intended to guard the eBook against misuse. The eBook is authorised to your personal Adobe ID when it is downloaded; you can then read it only on devices that are also registered to that Adobe ID.
File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables and figures. A PDF can be displayed on almost any device, but is of only limited use on small displays (smartphone, eReader).
System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a
eReader: This eBook can be read on (almost) all eBook readers. It is not, however, compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a
Buying eBooks from abroad
For tax law reasons we can sell eBooks only within Germany and Switzerland. Regrettably, we cannot fulfil eBook orders from other countries.