As the field of heritage language acquisition expands, there is a growing need for proficiency measures that allow speakers to be compared across groups and studies. Elicited imitation tasks (EITs) are efficient, cost-effective tasks with a long tradition in the proficiency assessment of second language (L2) learners, first language children, and adults. However, little research has investigated their use with heritage speakers (HSs), despite their oral nature, which makes them appropriate for speakers with variable literacy skills. This study is a partial replication of Solon, Park, Dehghan-Chaleshtori, Carver & Long (2022), who administered an EIT originally developed for advanced L2 learners to a group of HSs. In this study, we administered the same EIT with minor modifications to 70 HSs and 132 L2 learners of Spanish at different levels of proficiency and ran a Rasch analysis to evaluate the functioning of the task with the two groups. To obtain concurrent validity evidence, scores on the EIT were compared with participants’ performance in an oral narration, evaluated for complexity, accuracy, and fluency (CAF), and with a standardized oral proficiency test, the Versant Spanish Test. Results of the Rasch analyses showed that the EIT was effective at distinguishing different levels of ability in both groups, and further analyses showed moderate to strong correlations between the CAF measures and the EIT and very strong correlations between the EIT and the Versant Spanish Test. These results provide evidence that the EIT is an efficient and adequate proficiency test for HSs and L2 learners of Spanish; its use in research settings is recommended.
When aiming to change behavior, policymakers confront the challenge of implementing behavioral interventions across contexts. However, the effectiveness of behavioral solutions often hinges on context, posing a significant hurdle to scaling interventions. This study explores the application of a behavioral pattern language approach as a means to enhance intervention efficacy and support policymakers and practitioners who seek to solve problems at scales that cross diverse contexts. The study demonstrates how a pattern language can inform contextually aware solutions, fostering collaboration and knowledge sharing among stakeholders. Additionally, the research finds practitioners deploy multiple solutions within complex systems to achieve more difficult behavioral change goals. Despite challenges related to replicability and evolving methodologies, the findings suggest that pattern languages offer a promising avenue for systematically generating and disseminating behavioral insights. This research contributes to advancing applied behavioral science by providing a structured approach for collaborative policymaking and research endeavors that are contextually relevant and effective.
This chapter identifies and explores a central feature of automated legal guidance: “simplexity.” As this chapter introduces this term, simplexity occurs when the government presents clear and simple explanations of the law without highlighting its underlying complexity or reducing this complexity through formal legal changes. Automated legal guidance inherently relies on simplexity as a result of the tension between the complexity of the law and the need of agencies to explain the law in simple terms. In creating the law, the federal government must address complex problems, and it often does so by creating legislation that is replete with errors, ambiguities, and problems. This disconnect between complex federal law and agencies’ need to explain the law to the public in simple and understandable ways forces agencies to rely on simplexity. Automated legal guidance only exacerbates the need for simplexity, because when individuals use automated online tools offered by government agencies, they expect the explanations to be even simpler, more straightforward, and easier to apply than would be the case if they were relying upon written agency publications.
Using a laboratory experiment, we investigate complexity in decision problems as a cause of failures in contingent reasoning. For this purpose, we introduce three dimensions of complexity to a decision problem: the number of contingencies, the dominance property of choices, and reducible states. Each decision problem is designed to reflect variations in complexity across the three dimensions. Experimental results show that the number of contingencies has the most significant effect on failures in contingent reasoning. The second dimension, the dominance property of choices, also has a statistically significant effect, though the effect size is smaller than in the existing literature. In contrast, the third complexity dimension has no impact; presenting the decision problem in a reduced or reducible form does not change subjects’ performance on contingent reasoning. Additionally, we examine the Power of Certainty and show its existence. This effect is particularly pronounced when the number of contingencies is large.
In Chapter 1, I explain how the book can be read and used in a nonlinear fashion, providing affordances for further exploration, comparable to the way the book approaches the creation and experience of works of art. The chapter proceeds to present a detailed advance organizer in the form of a point-by-point overview of the main messages and ideas of this book, providing a framework for the way the book can be read and used.
Systems thinking is deeply rooted in history, reaching as far back as Aristotle. However, it has only relatively recently reemerged as an approach to help us understand and intervene in health and food systems. This is particularly salient given its impact on environmental and population health. Whilst global food is abundant, many people cannot access affordable, healthy and culturally appropriate food; on the other hand, foods of low nutrient density are widely available. Food systems are complex and require complex thinking and approaches that allow us to consider the influence of multiple factors and how the system might respond to change. In turn, this enables the identification of ‘leverage’ points, where policies or interventions are most likely to have a sustained impact. The Foresight obesity map inspired others to adopt systems approaches to help understand the broader social, economic and environmental determinants of obesity to support intervention/policy development. Evaluation of these requires a consideration of complexity to explore why intervention goals may or may not have been achieved and how relationships between components or approaches can be enhanced to support implementation and thereby increase the potential for effectiveness. Overall, approaches to understand, intervene in, govern and evaluate food systems must themselves be sufficiently complex, or they will ultimately be destroyed by the system they seek to improve. This review paper aims to introduce readers to the application of systems approaches in research within the context of food systems and health, including their traditional/historical origins.
Legal risks are a significant part of a firm’s overall risk profile, and their management is typically guided by calculating the probability of a risk and its potential magnitude. Yet this calculation does not fully capture how risks manifest for organizations. This chapter presents a novel way of evaluating legal risk, termed transformative legal risk management. Transformative legal risk management differs from traditional approaches because it incorporates a new layer of understanding legal risk, using a four-pronged approach to risk management known as VUCA (volatility, uncertainty, complexity, and ambiguity). This chapter introduces two of the four VUCA risks: volatility and uncertainty. The chapter defines volatility, identifies sources of legal volatility, and presents responses that legal experts can use in response to volatility risk. It then defines uncertainty, identifies sources of legal uncertainty, and presents responses firms can use to reduce uncertainty risk. The chapter shows how firms applying VUCA can not only minimize harm from legal risks but also elevate legal risk management into a practice that generates a competitive advantage over rivals.
Continuing the exploration of transformative legal risk management, this chapter addresses the remaining two risk dimensions that govern a VUCA environment: complexity and ambiguity. Complexity describes an environment that contains numerous interconnected parts, accepts inputs, generates outputs, and develops a capacity to learn and remember. The fourth and final dimension of VUCA is ambiguity. Especially challenging for firms deploying legal knowledge, ambiguity describes an environment where the causes and effects propelling events forward are largely unknown, the firm does not know whether an organized system will emerge, and little historical precedent exists for determining the most appropriate course of action. The chapter defines complexity and ambiguity, explains how they are applicable to legal risk, and articulates strategies for firms to use their legal knowledge to anticipate and address complex and ambiguous legal problems.
Radiotherapy departments need to allocate appropriate treatment appointment times to maintain quality of care. Lung cancer patients typically exceed their appointment time due to their increased co-morbidities. Modern radiotherapy methods have reduced treatment time; however, different complexity factors cannot be predicted, indicating that time allocation for treatment appointments requires regular monitoring.
Methods:
Quantitative data were collected for 4 weeks, including treatment time allocated, actual treatment time required, and different complexity factors of radical lung cancer patients. Descriptive statistics were employed to analyse the treatment times recorded. The Wilcoxon signed-rank test was deployed to determine statistical significance.
Results:
Nineteen cancer patients were included in data collection, and 76 treatment times were recorded. Over 70% of patients’ treatment appointments exceeded the allocated 15 minutes. Eleven of the 15 complexity factors recorded were statistically significant. The overall difference in treatment appointment time was statistically significant: on average, patients required 3 minutes longer than allocated.
Conclusion:
Most treatments recorded exceeded their allocated appointment time. Patient complexity factors significantly influenced time, indicating that appointment allocation needs to be considered on a patient-to-patient basis. This evaluation determined that appointment allocation needs to be investigated for all cancer patients in individual departments, to ensure high-quality care.
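The paired comparison of allocated versus actual appointment times described above can be sketched with the Wilcoxon signed-rank statistic. The sketch below is a minimal, stdlib-only illustration; the numbers are invented for demonstration and are not the study's data or code.

```python
# Illustrative sketch of the Wilcoxon signed-rank statistic (W+) for paired
# allocated vs. actual treatment times. Invented numbers, not the study's data.
def signed_rank_statistic(allocated, actual):
    """Return W+ (sum of ranks of positive differences), dropping zero
    differences and assigning average ranks to tied |differences|."""
    diffs = [a - b for a, b in zip(actual, allocated) if a != b]
    ordered = sorted(diffs, key=abs)
    ranks = {}
    i = 0
    while i < len(ordered):
        j = i
        while j < len(ordered) and abs(ordered[j]) == abs(ordered[i]):
            j += 1
        avg = (i + 1 + j) / 2          # mean of 1-based ranks i+1 .. j
        ranks.setdefault(abs(ordered[i]), avg)
        i = j
    return sum(ranks[abs(d)] for d in diffs if d > 0)

allocated = [15] * 8                       # minutes allocated per appointment
actual = [18, 17, 20, 15, 19, 13, 22, 18]  # minutes actually required
print(signed_rank_statistic(allocated, actual))  # → 26.5
```

In practice one would use a library routine (e.g. a statistics package's signed-rank test) to obtain the p-value as well; the point here is only the ranking logic behind the statistic the abstract names.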
The Stochastic Becker-DeGroot-Marschak (SBDM) mechanism is a theoretically elegant way of eliciting incentive-compatible beliefs under a variety of risk preferences. However, the mechanism is complex and there is concern that some participants may misunderstand its incentive properties. We use a two-part design to evaluate the relationship between participants’ probabilistic reasoning skills, task complexity, and belief elicitation. We first identify participants whose decision-making is consistent and inconsistent with probabilistic reasoning using a task in which non-Bayesian modes of decision-making lead to violations of stochastic dominance. We then elicit participants’ beliefs in both easy and hard decision problems. Relative to Introspection, there is less variation in belief errors between easy and hard problems in the SBDM mechanism. However, there is a greater difference in belief errors between consistent and inconsistent participants. These results suggest that while the SBDM mechanism encourages individuals to think more carefully about beliefs, it is more sensitive to heterogeneity in probabilistic reasoning. In a follow-up experiment, we also identify participants with high and low fluid intelligence with a Raven task, and high and low proclivities for cognitive effort using an extended Cognitive Reflection Test. Although performance on these tasks strongly predicts errors in both the SBDM mechanism and Introspection, there is no significant interaction effect between the elicitation mechanism and either ability or effort. Our results suggest that mechanism complexity is an important consideration when using elicitation mechanisms, and that participants’ probabilistic reasoning is an important consideration when interpreting elicited beliefs.
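To make the mechanism's incentive property concrete, the sketch below simulates one common formulation of the SBDM: the subject reports a probability, a uniform random cutoff is drawn, and the subject is paid either via a lottery with the cutoff's winning probability or on the event itself. The function names and the uniform draw are our assumptions for illustration, not the authors' implementation; the simulation shows that truthful reporting yields a higher win rate than misreporting.

```python
# Hedged sketch of SBDM belief elicitation: truthful reports maximize the
# chance of winning the prize, regardless of the shape of risk preferences.
import random

def sbdm_round(report, true_prob, rng):
    """One SBDM round. Draw a random cutoff in [0, 1); if it exceeds the
    reported belief, pay via a lottery winning with probability = cutoff,
    otherwise pay if the event (which occurs with true_prob) happens."""
    cutoff = rng.random()
    if cutoff > report:
        return rng.random() < cutoff      # lottery pays with prob = cutoff
    return rng.random() < true_prob       # paid on the event itself

def win_rate(report, true_prob, n=100_000, seed=0):
    rng = random.Random(seed)
    return sum(sbdm_round(report, true_prob, rng) for _ in range(n)) / n

# For an event with P = 0.7, reporting 0.7 should beat misreporting 0.3.
print(win_rate(0.7, 0.7), win_rate(0.3, 0.7))
```

Analytically, the truthful win rate here is 0.7·0.7 + (1 − 0.7²)/2 ≈ 0.745, versus ≈ 0.665 for a report of 0.3, which the simulation approximates.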
We examine experimentally how complexity affects decision-making, when individuals choose among different products with varying benefits and costs. We find that complexity in costs leads to choosing a high-benefit product, with high costs and overall lower payoffs. In contrast, when complexity is in the benefits of the product, we cannot reject the hypothesis of random mistakes. We also examine the role of heterogeneous complexity. We find that individuals still (mistakenly) choose the high-benefit but costly product, even if cheaper and simple products are available. Our results suggest that salience is a main driver of choices under different forms of complexity.
We present an experiment where subjects sequentially receive signals about the true state of the world and need to form beliefs about which one is true, with payoffs related to reported beliefs. We attempt to control for risk aversion using the Offerman et al. (Rev Econ Stud 76(4):1461–1489, 2009) technique. Against the baseline of Bayesian updating, we test for belief adjustment underreaction and overreaction and model the decision making process of the agent as a double hurdle model where agents with inferential expectations first decide whether to adjust their beliefs and then, if so, decide by how much. We also test the effects of increased inattention and complexity on belief updating. We find evidence for periods of belief inertia interspersed with belief adjustment. This is due to a combination of random belief adjustment; state-dependent belief adjustment, with many subjects requiring considerable evidence to change their beliefs; and quasi-Bayesian belief adjustment, with aggregate insufficient belief adjustment when a belief change does occur. Inattention, like complexity, makes subjects less likely to adjust their stated beliefs, while inattention additionally discourages full adjustment.
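The Bayesian baseline against which under- and overreaction are tested can be written in a few lines. This is a generic illustration of the updating rule, not the paper's design: the two-state setup and the 2/3 signal accuracy are invented for the example.

```python
# Minimal Bayesian-updating baseline for a two-state belief task.
# Signal accuracy (2/3) is an invented example, not the study's parameter.
def bayes_update(prior, likelihood_if_A, likelihood_if_B):
    """Posterior probability of state A after observing one signal."""
    num = prior * likelihood_if_A
    return num / (num + (1 - prior) * likelihood_if_B)

p = 0.5                          # flat prior over states A and B
p = bayes_update(p, 2/3, 1/3)    # one A-favouring signal: posterior 2/3
p = bayes_update(p, 2/3, 1/3)    # a second A-favouring signal: posterior 4/5
print(round(p, 3))               # → 0.8
```

Underreaction in the paper's sense corresponds to reported beliefs moving less than this rule prescribes after each signal; overreaction, to moving more.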
The literature suggests that product and product-service system (PSS) design problems are characteristically different. However, there is limited empirical evidence to suggest that the design cognition specific to the respective design activities is different. This article reports the findings of a comparative study of protocols of conceptual product and PSS designing carried out in a laboratory environment by 28 pairs of experienced product designers from the manufacturing industry. First, differences between product and PSS design problems were theoretically characterized in terms of their respective sources of complexity. Based on these differences, hypotheses concerning differences in the cognitive processes of conceptual product and PSS designing were developed and empirically tested. Results indicate that PSS designing by experienced product designers is more problem-focused while product designing is more solution-focused. PSS designing was found to focus more on the design issue function and the design process formulation. Further, PSS designing was more likely to apply a depth-first search strategy, while product designing was more apt to apply a breadth-first search strategy. Results point towards the need to support the analysis of derived behavior of structure and the application of a breadth-first strategy during PSS designing by product designers.
Inflectional systems vary along multiple dimensions (number of members, size of paradigms, word class, integrative complexity, accidents of history, etc.). This makes it difficult to find significant correlations and causality relations between different properties, as attested systems usually differ in multiple ways at the same time, thus obscuring possible relations between individual variables. Here we analyze the relation between a system’s size by number of members and its morphological complexity. We do so by exploring in detail, via quantitative methods, the cognate inflectional systems of Central Pame and Chichimec (Otomanguean, Mexico), whose inflecting nominal classes differ precisely mostly with regard to their size (i.e. number of members).
In this chapter I describe how my interests in and commitment to developmental psychology grew in a multidimensional, discontinuous, nonlinear fashion. Prominent early personal, social, and intellectual influences included coming of age in the 1960s and transitioning from fervent Catholicism to philosophy and science as my guiding stars through college and graduate school. I shape my story around the notion that “half of life is accident, and the other half is what one intentionally makes out of accident.” I began my work by focusing on the nature, causes, and consequences of child maltreatment. Pursuing further research, practice, and policy interests, I conducted theory-informed longitudinal studies of the influence of risk factors (especially poverty and violence) on various dimensions of developmental processes and outcomes. More recently, I shifted focus to the design, conduct, and analysis of randomized trials of school- and/or neighborhood-based social-emotional learning interventions, in the USA and then in conflict- and crisis-affected countries in the Middle East, Africa, and Latin America. Creating and supporting collaborations with students, colleagues, and organizations has been critical throughout.
The introduction describes some of the key features and the wide range of actors and activities that characterise the workings of a long-distance bus station in Accra, Ghana’s capital. It then presents two meanings of hustle that capture the station’s workings: as a noun, describing crowded, hectic, and potentially confusing situations; and as a verb, denoting precarious yet venturesome economic activities. Building on the ambivalences evoked by the different uses and perspectives of the term, it situates the significance of this study in relation to scholarly discussions of transport work, the ‘informal economy’, (auto)mobility, infrastructure, and urban social life. It then outlines the diversity of functions and types of Ghanaian bus stations, and concludes with a reflection on methodology, highlighting the value of a single-sited ethnographic approach to urban complexity and trans-local mobility, and an itinerary of the book’s chapters.
Digital twins are a new paradigm for our time, offering the possibility of interconnected virtual representations of the real world. The concept is very versatile and has been adopted by multiple communities of practice, policymakers, researchers, and innovators. A significant part of the digital twin paradigm is about interconnecting digital objects, many of which have previously not been combined. As a result, members of the newly forming digital twin community are often talking at cross-purposes, based on different starting points, assumptions, and cultural practices. These differences are due to the philosophical world-view adopted within specific communities. In this paper, we explore the philosophical context which underpins the digital twin concept. We offer the building blocks for a philosophical framework for digital twins, consisting of 21 principles that are intended to help facilitate their further development. Specifically, we argue that the philosophy of digital twins is fundamentally holistic and emergentist. We further argue that in order to enable emergent behaviors, digital twins should be designed to reconstruct the behavior of a physical twin by “dynamically assembling” multiple digital “components”. We also argue that digital twins naturally include aspects relating to the philosophy of artificial intelligence, including learning and exploitation of knowledge. We discuss the following four questions (i) What is the distinction between a model and a digital twin? (ii) What previously unseen results can we expect from a digital twin? (iii) How can emergent behaviours be predicted? (iv) How can we assess the existence and uniqueness of digital twin outputs?
Global Leaders in the 21st Century examines the current context of international management and looks at the noteworthy changes in the business and leadership contexts of globalization. A major shift appears to be taking place in the global political economy. The predominant system characterized by global economic agreements, free trade, global supply chains, and multilateral institutions is being challenged by an increase in the primacy of national interests and security. In this volatile, uncertain, complex, and ambiguous (VUCA) environment, traditional ways of managing are not entirely adequate, and global leaders need to develop new skills. This chapter introduces the concept of Mindful Global Leadership and its components of context sensitivity, perspective taking, and a process orientation. It also presents a global leadership typology based on task complexity and relationship complexity.
We begin with the theoretical and empirical foundations of happiness economics, in which the aim of economic policy is to maximize self-reported happiness of people in society. We also discuss the economic correlates of self-reported happiness. We outline some of the key insights from the literature on behavioral industrial organization, such as phishing for phools and the effects of limited attention on the pricing decisions of firms. When products have several attributes, we explain how some might be more salient than others. We also explain the effects of limited attention on economic outcomes. We introduce the basics of complexity economics. Here, people use simple rules of thumb and simple adaptive learning models in the presence of true uncertainty. We show that the aggregate systemwide outcomes are complex, characterized by chaotic dynamics, and the formation of emergent phenomena. The observed fluctuations in the system arise endogenously, rather than from stochastic exogenous shocks. We introduce two kinds of learning models – reinforcement learning and beliefs-based learning. Finally, we critically evaluate the literature on competitive double auction experiments.