Handbook of Metaheuristics (eBook)

eBook download: PDF
2018 | 3rd ed. 2019
XX, 604 pages
Springer International Publishing (publisher)
978-3-319-91086-4 (ISBN)

181.89 incl. VAT
  • Download available immediately
The third edition of this handbook is designed to provide broad coverage of the concepts, implementations, and applications of metaheuristics. The book's chapters serve as stand-alone presentations, giving both the necessary underpinnings and practical guides for implementation. The nature of metaheuristics invites an analyst to modify basic methods in response to problem characteristics, past experiences, and personal preferences, and the chapters in this handbook are designed to facilitate this process as well. This new edition has been fully revised and features new chapters on swarm intelligence and on the automated design of metaheuristics from flexible algorithm frameworks. The authors who have contributed to this volume are leading figures in the metaheuristics community, responsible for pioneering contributions to the fields they write about. Their collective work has significantly enriched the field of optimization in general and combinatorial optimization in particular.
Metaheuristics are solution methods that orchestrate an interaction between local improvement procedures and higher-level strategies to create a process capable of escaping from local optima and performing a robust search of a solution space. Many new and exciting developments and extensions have been observed in recent years. Hybrids of metaheuristics with other optimization techniques, such as branch-and-bound, mathematical programming, or constraint programming, are also increasingly popular. On the applications front, metaheuristics are now used to find high-quality solutions to an ever-growing number of complex, ill-defined real-world problems, in particular combinatorial ones. This handbook should continue to be a great reference for researchers, graduate students, and practitioners interested in metaheuristics.
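The definition above can be illustrated with a minimal sketch (not taken from the book): a generic iterated local search, where greedy hill-climbing plays the role of the local improvement procedure and a random perturbation plays the role of the higher-level strategy that escapes local optima. The toy bit-string objective and all function names here are purely illustrative.

```python
import random

def local_search(x, f, neighbors):
    """Local improvement: move to a better neighbor until none exists."""
    improved = True
    while improved:
        improved = False
        for y in neighbors(x):
            if f(y) > f(x):
                x, improved = y, True
                break
    return x

def iterated_local_search(f, neighbors, perturb, x0, iterations, rng):
    """Higher-level strategy: perturb the best solution, re-optimize, keep the best."""
    best = local_search(x0, f, neighbors)
    for _ in range(iterations):
        x = local_search(perturb(best, rng), f, neighbors)
        if f(x) > f(best):
            best = x
    return best

# Toy problem (hypothetical): maximize the number of 1-bits in a bit string.
def f(x):
    return sum(x)

def neighbors(x):
    for i in range(len(x)):          # all single-bit flips
        y = list(x)
        y[i] ^= 1
        yield tuple(y)

def perturb(x, rng):
    y = list(x)
    for i in rng.sample(range(len(y)), 3):  # flip 3 random bits to escape
        y[i] ^= 1
    return tuple(y)

rng = random.Random(0)
x0 = tuple(rng.randint(0, 1) for _ in range(12))
best = iterated_local_search(f, neighbors, perturb, x0, 5, rng)
print(f(best))  # 12: hill-climbing reaches the all-ones optimum on this toy objective
```

On this trivially unimodal objective, local search alone already finds the optimum; on real combinatorial problems, the perturbation/acceptance loop is what lets the process leave local optima, which is exactly the interaction the paragraph above describes.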


Michel Gendreau is Department Chair and Professor of Operations Research in the Department of Mathematics and Industrial Engineering at Polytechnique Montréal (Canada). He received his Ph.D. from the Université de Montréal in 1984. His main research area is the application of operations research methods to a wide range of problem areas: transportation and logistics systems planning and operation, energy production and storage, healthcare, and telecommunications. Dr. Gendreau has published more than 300 papers in peer-reviewed journals and conference proceedings. He was the Editor-in-Chief of Transportation Science from 2009 to 2014, and he is a member of several other editorial boards. Dr. Gendreau has received several research grants and awards, including the Robert Herman Lifetime Achievement Award of the Transportation Science & Logistics Society of INFORMS and the Merit Award of the Canadian Operational Research Society. He was elected Fellow of INFORMS in 2010.
Jean-Yves Potvin is Professor at Université de Montréal in the Department of Computer Science and Operations Research. He is also Assistant Director of the Interuniversity Research Centre on Enterprise Networks, Logistics and Transportation (CIRRELT). His research integrates operations research and artificial intelligence techniques. More precisely, he is interested in the development of adaptive algorithms based on local search-based metaheuristics, genetic algorithms and neural networks to address different types of vehicle routing problems. He also works on parallel implementations of these algorithms for real-time applications, like dynamic vehicle dispatching.


Preface to the Third Edition 6
Preface to the Second Edition 8
Preface to the First Edition 10
Contents 13
Contributors 15
1 Simulated Annealing: From Basics to Applications 19
1.1 Introduction 20
1.2 Basics 20
1.2.1 Local Search (or Monte Carlo) Algorithms 21
1.2.2 Metropolis Algorithm 22
1.2.3 Simulated Annealing (SA) Algorithm 23
1.3 Theory 25
1.3.1 Statistical Equilibrium 25
1.3.2 Asymptotic Convergence 29
1.4 Practical Issues 32
1.4.1 Finite-Time Approximation 32
1.4.2 Geometric Cooling 33
1.4.3 Cooling in Polynomial Time 33
1.4.3.1 Initial Temperature c0 34
1.4.3.2 Decay of the Control Parameter 34
1.4.3.3 Length of Markov Chains 35
1.4.3.4 Stopping Criterion 35
1.4.3.5 Summary 36
1.4.4 Simulation-Based Evaluation 37
1.5 Illustrative Applications 38
1.5.1 Knapsack Problem 39
1.5.1.1 Mathematical Modeling 39
1.5.1.2 Simulated Annealing Implementation 40
1.5.2 Traveling Salesman Problem 41
1.5.2.1 Mathematical Modeling 43
1.5.2.2 Simulated Annealing Implementation 44
1.6 Large-Scale Aircraft Trajectory Planning 46
1.6.1 Mathematical Modeling 46
1.6.2 Computational Experiments with SA 50
1.7 Conclusion 52
References 53
2 Tabu Search 54
2.1 Introduction 54
2.2 The Classical Vehicle Routing Problem 55
2.3 Basic Concepts 56
2.3.1 Historical Background 56
2.3.2 Tabu Search 57
2.3.3 Search Space and Neighborhood Structure 57
2.3.4 Tabus 59
2.3.5 Aspiration Criteria 60
2.3.6 A Template for Simple Tabu Search 60
2.3.7 Termination Criteria 61
2.3.8 Probabilistic TS and Candidate Lists 61
2.4 Intermediate Concepts 62
2.4.1 Intensification 62
2.4.2 Diversification 63
2.4.3 Allowing Infeasible Solutions 64
2.4.4 Surrogate and Auxiliary Objectives 64
2.5 Advanced Concepts 65
2.6 Key References 66
2.7 Tricks of the Trade 66
2.7.1 Getting Started 66
2.7.2 More Tips 67
2.7.3 Additional Tips for Probabilistic TS 67
2.7.4 Parameter Calibration and Computational Testing 68
2.8 Conclusion 69
References 69
3 Variable Neighborhood Search 73
3.1 Introduction 74
3.2 Basic Schemes 75
3.3 Some Extensions 81
3.4 Changing Formulation Within VNS 83
3.4.1 Variable Neighborhood-Based Formulation Space Search 83
3.4.2 Variable Formulation Search 84
3.5 Primal-Dual VNS 87
3.6 VNS for Mixed Integer Linear Programming 89
3.6.1 Variable Neighborhood Branching 89
3.6.2 VNDS Based Heuristics for MILP 92
3.6.2.1 VNDS for 0-1 MILPs with Pseudo-Cuts 92
3.6.2.2 A Double Decomposition Scheme 95
3.6.2.3 Comparison 97
3.7 Variable Neighborhood Search for Continuous Global Optimization 98
3.8 Variable Neighborhood Programming (VNP): VNS for Automatic Programming 101
3.9 Discovery Science 105
3.10 Conclusions 106
References 109
4 Large Neighborhood Search 114
4.1 Introduction 114
4.1.1 Example Problems 115
4.1.2 Neighborhood Search 116
4.2 Large Neighborhood Search 117
4.3 Adaptive Large Neighborhood Search 120
4.3.1 Designing an ALNS Algorithm 123
4.3.2 Properties of the ALNS Framework 125
4.3.3 Relation to Other Metaheuristics 126
4.3.4 Parallelism 126
4.4 Applications of LNS and ALNS 127
4.4.1 Vehicle Routing Applications 127
4.4.2 Other Applications 129
4.5 Very Large-Scale Neighborhood Search 133
4.5.1 Variable-Depth Methods 134
4.5.2 Network Flow-Based Improvement Algorithms 135
4.5.2.1 Neighborhoods Defined by Cycles 135
4.5.2.2 Neighborhoods Defined by Paths 135
4.5.2.3 Neighborhoods Defined by Assignments and Matching 136
4.5.3 Other VLSN Algorithms 136
4.6 Conclusion 136
References 137
5 Iterated Local Search: Framework and Applications 143
5.1 Introduction 144
5.2 Iterating a Local Search 145
5.2.1 General Framework 145
5.2.2 Random Restart 146
5.2.3 Searching in S* 147
5.2.4 Iterated Local Search 148
5.3 Getting High Performance 150
5.3.1 Initial Solution 151
5.3.2 Perturbation 153
5.3.2.1 Perturbation Strength 154
5.3.2.2 Adaptive Perturbations 155
5.3.2.3 More Complex Perturbation Schemes 156
5.3.2.4 Speed 156
5.3.3 Acceptance Criterion 158
5.3.3.1 Example 1: TSP 159
5.3.3.2 Example 2: QAP 160
5.3.4 Local Search 161
5.3.5 Global Optimization of ILS 162
5.4 Selected Applications of ILS 164
5.4.1 ILS for the TSP 164
5.4.2 ILS for Other Routing Problems 166
5.4.3 ILS for Scheduling Problems 167
5.4.4 ILS for Other Problems 169
5.4.5 Summary 171
5.5 Relation to Other Metaheuristics 171
5.5.1 Neighborhood-Based Metaheuristics 172
5.5.2 Multi-Start-Based Metaheuristics 173
5.6 Conclusions 175
References 176
6 Greedy Randomized Adaptive Search Procedures: Advances and Extensions 183
6.1 Introduction 184
6.2 Construction of the Restricted Candidate List 186
6.3 Alternative Construction Mechanisms 190
6.3.1 Random Plus Greedy and Sampled Greedy Construction 191
6.3.2 Reactive GRASP 192
6.3.3 Cost Perturbations 192
6.3.4 Bias Functions 193
6.3.5 Intelligent Construction: Memory and Learning 194
6.3.6 POP in Construction 194
6.3.7 Lagrangean GRASP Heuristics 195
6.3.7.1 Lagrangean Relaxation and Subgradient Optimization 195
6.3.7.2 A Template for Lagrangean Heuristics 196
6.3.7.3 Lagrangean GRASP 198
6.4 Path-Relinking 202
6.4.1 Forward Path-Relinking 206
6.4.2 Backward Path-Relinking 206
6.4.3 Back and Forward Path-Relinking 206
6.4.4 Mixed Path-Relinking 207
6.4.5 Truncated Path-Relinking 207
6.4.6 Greedy Randomized Adaptive Path-Relinking 208
6.4.7 Evolutionary Path-Relinking 210
6.4.8 External Path-Relinking and Diversification 211
6.5 Restart Strategies 212
6.6 Extensions 218
6.7 Applications 219
6.8 Concluding Remarks 220
References 222
7 Intelligent Multi-Start Methods 235
7.1 Introduction 236
7.2 An Overview 237
7.2.1 Memory Based Designs 237
7.2.2 GRASP 240
7.2.3 Constructive Designs 242
7.2.4 Hybrid Designs 244
7.2.5 Theoretical Analysis 246
7.3 A Classification 247
7.4 The Maximum Diversity Problem 249
7.4.1 Multi-Start Without Memory (MSWoM) 250
7.4.2 Multi-Start With Memory (MSWM) 251
7.4.3 Experimental Results 252
7.5 Conclusion 254
References 254
8 Next Generation Genetic Algorithms: A User's Guide and Tutorial 258
8.1 Introduction 258
8.2 Classic Simple Genetic Algorithms (SGA) 260
8.2.1 The Population and Selection 261
8.2.2 Tournament Selection 264
8.3 Steady State and Monotonic Genetic Algorithms 264
8.4 The Demise of Hyperplane Sampling Theory 265
8.5 Gray Box Optimization 267
8.6 The k-Bounded Pseudo-Boolean Functions 267
8.6.1 Tunneling Between Optima 269
8.6.2 How to Select Improving Moves in Constant Time 271
8.6.3 Looking Multiple Steps Ahead 274
8.7 The Traveling Salesman Problem (TSP): Tunneling Between Optima 275
8.8 An Iterated Hybrid Genetic Algorithm 277
8.8.1 The Limitations of Tunneling and Partition Crossover 279
8.9 The EAX Algorithms for the TSP 279
8.10 Massively Parallel Genetic Algorithms 282
8.11 Conclusions 283
References 284
9 An Accelerated Introduction to Memetic Algorithms 288
9.1 Introduction and Historical Notes 288
9.2 Memetic Algorithms 290
9.2.1 Basic Concepts 291
9.2.2 Search Landscapes 292
9.2.3 Local vs. Population-Based Search 295
9.2.4 Recombination 297
9.2.5 A Memetic Algorithm Template 300
9.2.6 Designing an Effective Memetic Algorithm 303
9.3 Algorithmic Extensions of Memetic Algorithms 305
9.3.1 Multiobjective Memetic Algorithms 305
9.3.2 Continuous Optimization 306
9.3.3 Memetic Computing Approaches 307
9.3.4 Self-⋆ Memetic Algorithms 308
9.3.5 Memetic Algorithms and Complete Techniques 310
9.4 Applications of Memetic Algorithms 310
9.5 Conclusions 311
References 313
10 Ant Colony Optimization: Overview and Recent Advances 323
10.1 Introduction 323
10.2 Approximate Approaches 324
10.2.1 Construction Algorithms 325
10.2.2 Local Search Algorithms 326
10.3 The ACO Metaheuristic 327
10.3.1 Problem Representation 328
10.3.2 The Metaheuristic 329
10.4 History 331
10.4.1 Biological Analogy 331
10.4.2 Historical Development 333
10.4.2.1 The First ACO Algorithm: Ant System and the TSP 333
10.4.2.2 Ant System and Its Extensions 334
10.4.2.3 Applications to Dynamic Network Routing Problems 336
10.4.2.4 Towards the ACO Metaheuristic 336
10.5 Applications 337
10.5.1 Example 1: The Single Machine Total Weighted Tardiness Scheduling Problem (SMTWTP) 337
10.5.2 Example 2: The Set Covering Problem (SCP) 338
10.5.3 Example 3: AntNet for Network Routing Applications 339
10.5.4 Applications of the ACO Metaheuristic 341
10.5.5 Main Application Principles 343
10.5.5.1 Definition of Solution Components and Pheromone Trails 343
10.5.5.2 Balancing Exploration and Exploitation 344
10.5.5.3 ACO and Local Search 345
10.5.5.4 Heuristic Information 345
10.6 Developments 346
10.6.1 Non-standard Applications of ACO 346
10.6.1.1 Multi-Objective Optimization 346
10.6.1.2 Dynamic Versions of NP-hard Problems 347
10.6.1.3 Stochastic Optimization Problems 347
10.6.1.4 Continuous Optimization 348
10.6.2 Algorithmic Developments 348
10.6.2.1 Hybridizations of ACO with Other Metaheuristics 349
10.6.2.2 Hybridizations of ACO with Branch-and-Bound Techniques 349
10.6.2.3 Combinations of ACO with Constraint and Integer Programming Techniques 350
10.6.3 Parallel Implementations 350
10.6.4 Theoretical Results 351
10.7 Conclusions 353
References 353
11 Swarm Intelligence 364
11.1 Introduction 364
11.2 Biological Examples 367
11.3 Particle Swarm Optimization 369
11.3.1 Inertia Weighted and Constricted PSOs 372
11.3.2 Memory-Swarm vs. Explorer-Swarm 373
11.3.3 Particle Dynamics Through a Simplified Example 374
11.3.3.1 One Particle 375
11.3.3.2 Two Particles 377
11.4 PSO Variants 377
11.4.1 Fully Informed PSO 378
11.4.2 Bare-Bones PSO 378
11.4.3 Binary PSO 379
11.4.4 Discrete PSO 379
11.4.5 SPSO-2011 380
11.4.6 Other PSO Variants 382
11.5 PSO Applications 382
11.5.1 Multiobjective Optimization 383
11.5.2 Optimization in Dynamic Environments 385
11.5.3 Multimodal Optimization 387
11.6 PSO Theoretical Works 388
11.7 Other SI Applications 389
11.7.1 Swarm Robotics 389
11.7.2 Swarm Intelligence in Data Mining 390
11.8 Conclusion 391
References 391
12 Metaheuristic Hybrids 396
12.1 Introduction 397
12.2 Classification 398
12.3 Finding Initial or Improved Solutions by Embedded Methods 401
12.4 Multi-Stage Approaches 402
12.5 Decoder-Based Approaches 405
12.6 Solution Merging 407
12.7 Strategic Guidance of Metaheuristics by Other Techniques 409
12.7.1 Using Information Gathered by Other Algorithms 409
12.7.2 Enhancing the Functionality of Metaheuristics 411
12.8 Strategic Guidance of Other Techniques by Metaheuristics 413
12.9 Decomposition Approaches 415
12.9.1 Exploring Large Neighborhoods 415
12.9.2 Hybrids Based on MIP Decomposition Techniques 417
12.9.2.1 Lagrangian Decomposition 418
12.9.2.2 Column Generation 418
12.9.2.3 Benders Decomposition 419
12.9.3 Using Metaheuristics for Constraint Propagation 420
12.10 Summary and Conclusions 420
References 422
13 Parallel Metaheuristics and Cooperative Search 429
13.1 Introduction 429
13.2 Metaheuristics and Parallelism 430
13.2.1 Sources of Parallelism 430
13.2.2 Performance Measures 432
13.2.3 Parallel Metaheuristics Strategies 433
13.3 Low-Level Parallelization Strategies 434
13.4 Domain Decomposition 437
13.5 Independent Multi-Search 438
13.6 Cooperative Search 439
13.6.1 pC/KS Synchronous Cooperative Strategies 442
13.6.2 pC/C Asynchronous Cooperative Strategies 444
13.7 pC/KC Cooperation Strategies: Creating New Knowledge 448
13.8 Conclusions 454
References 456
14 A Classification of Hyper-Heuristic Approaches: Revisited 462
14.1 Introduction 463
14.2 Previous Classifications 464
14.3 The Proposed Classification and Definition 465
14.4 Heuristic Selection Methodologies 467
14.4.1 Approaches Based on Construction Low-Level Heuristics 467
14.4.1.1 Representative Examples 468
14.4.2 Approaches Based on Perturbation Low-Level Heuristics 471
14.4.2.1 Representative Examples 472
14.4.3 Recent Research Trends 474
14.4.3.1 Software Frameworks 475
14.4.3.2 Multi-Objective 475
14.4.3.3 Theoretical and Foundational Studies 475
14.5 Heuristic Generation Methodologies 476
14.5.1 Representative Examples 477
14.5.2 Some Recent Examples 480
14.6 Conclusions 481
References 481
15 Reactive Search Optimization: Learning While Optimizing 487
15.1 Introduction 487
15.2 Different Reaction Possibilities 491
15.2.1 Reactive Prohibitions 491
15.2.2 Reacting on the Neighborhood 495
15.2.3 Reacting on the Annealing Schedule 497
15.2.4 Reacting on the Objective Function 500
15.2.5 Reactive Schemes in Population-Based Methods 502
15.3 Applications of Reactive Search Optimization 504
15.3.1 Classic Combinatorial Tasks 505
15.3.1.1 Knapsack and Related Problems 505
15.3.1.2 Problems on Graphs 505
15.3.1.3 Vehicle Routing Problems 506
15.3.1.4 Satisfiability and Related Problems 507
15.3.2 Neural Networks and Learning Systems 507
15.3.3 Continuous Optimization 508
15.3.4 Real-World Applications 509
15.3.4.1 Power Distribution Networks 509
15.3.4.2 Industrial Production and Delivery 509
15.3.4.3 Telecommunication Networks 510
15.3.4.4 Vehicle Routing and Dispatching 510
15.3.4.5 Industrial and Architectural Design 510
15.3.4.6 Biology 511
15.4 Conclusion 511
References 512
16 Stochastic Search in Metaheuristics 520
16.1 Introduction 520
16.2 General Framework 521
16.3 Convergence Results 524
16.4 Runtime Results 526
16.4.1 Some Methods for Runtime Analysis 527
16.4.2 Instance Difficulty and Phase Transitions 529
16.4.3 Some Notes on Special Runtime Results 531
16.5 Parameter Choice 532
16.6 No-Free-Lunch Theorems 534
16.7 Fitness Landscape Analysis 536
16.8 Black-Box Optimization 538
16.9 Stochastic Search Under Noise 539
16.10 Stochastic Search and Robustness 541
16.11 Conclusions 542
References 543
17 Automated Design of Metaheuristic Algorithms 548
17.1 Introduction 549
17.2 Automatic Algorithm Configuration 550
17.2.1 Design Choices for Metaheuristic Algorithms 551
17.2.2 Parameters and the Configuration Problem 553
17.2.3 Automatic Algorithm Configuration 555
17.2.3.1 ParamILS 558
17.2.3.2 SMAC 559
17.2.3.3 irace 560
17.3 Towards Metaheuristic Algorithm Design 561
17.3.1 Basic Uses of Configurators 562
17.3.2 Advanced Uses of Configurators 564
17.4 Examples 566
17.4.1 Improving the Anytime Behavior of Metaheuristics 566
17.4.2 Multi-Objective Ant Colony Optimization 568
17.4.3 Automated Design of Hybrid Stochastic Local Search Algorithms 569
17.5 Relevant Connections and Related Work 572
17.5.1 Online Parameter Control 572
17.5.2 Algorithm Portfolios and Algorithm Selection 573
17.5.3 Automated Design of Metaheuristics/Metaheuristic Algorithms 575
17.5.4 Other Related Work 576
17.6 Conclusions 577
References 578
18 Computational Comparison of Metaheuristics 587
18.1 Introduction 588
18.2 The Testbed 588
18.2.1 Using Existing Testbeds 588
18.2.2 Developing New Testbeds 589
18.2.2.1 Goals in Creating the Testbed 589
18.2.2.2 Accessibility of New Test Instances 590
18.2.2.3 Problem Instances with Known Optimal Solutions 591
18.2.3 Problem Instance Classification 591
18.3 Parameters 592
18.3.1 Parameter Space Visualization and Tuning 593
18.3.2 Parameter Interactions 595
18.3.3 Fair Testing Involving Parameters 595
18.4 Solution Quality Comparisons 596
18.4.1 Solution Quality Metrics 596
18.4.2 Comparative Performance on Different Types of Problem Instances 598
18.5 Runtime Comparisons 598
18.5.1 Runtime Limits Using the Same Hardware 599
18.5.2 Runtime Limits Using Different Hardware 600
18.5.3 Runtime Growth Rate 601
18.5.4 Alternatives to Runtime Limits 601
18.6 Parallel Algorithms 602
18.6.1 Evaluating Parallel Metaheuristics 603
18.6.2 Comparison When Competing Approaches Can Be Run 604
18.6.3 Comparison When Competing Approaches Cannot Be Run 605
18.7 Conclusion 607
References 607

Publication date (per publisher) 20.9.2018
Series International Series in Operations Research & Management Science
Additional info XX, 604 p. 98 illus., 56 illus. in color.
Place of publication Cham
Language English
Subject area Business & Economics: General / Reference; Business Administration / Management: Planning / Organization
Keywords Heuristics • hyper-heuristic • Memetic Algorithms • Metaheuristics • Operations Research • Optimization • reactive search • Search Optimization • Stochastic search • Swarm intelligence • Tabu Search
ISBN-10 3-319-91086-8 / 3319910868
ISBN-13 978-3-319-91086-4 / 9783319910864
PDF (watermarked)
Size: 9.6 MB

DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized for you. If the eBook is improperly passed on to third parties, it can be traced back to its source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly suited to technical books with columns, tables, and figures. A PDF can be displayed on almost all devices, but it is only partially suitable for small screens (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read on (almost) all eBook readers. It is, however, not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You need a PDF viewer, e.g. the free Adobe Digital Editions app.

Additional feature: online reading
In addition to downloading this eBook, you can also read it online in your web browser.

Buying eBooks from abroad
For tax law reasons, we can sell eBooks only within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.
